Effective Emotional Model of Pet-type Robot in Interactive Emotion Communication


SCIS & ISIS 2010, Dec. 8-12, 2010, Okayama Convention Center, Okayama, Japan

Effective Emotional Model of Pet-type Robot in Interactive Emotion Communication

Ryohei Taki, Yoichiro Maeda and Yasutake Takahashi
Dept. of Human and Artificial Intelligent Systems, Graduate School of Engineering, University of Fukui
3-9-1 Bunkyo, Fukui, Japan (rtaki, maeda,

Abstract: For robots and humans to live together, robots require the ability to understand human intention flexibly. In this research, we aim to realize Interactive Emotion Communication (IEC), a bidirectional communication between human and robot based on emotional behaviors. The purpose of IEC is to raise the personal affinity that the robot offers the human through interactive emotional behaviors between the two. IEC consists of three processes: (1) recognizing human emotion, (2) generating robot emotion, and (3) expressing robot emotion. This research mainly develops the process of generating robot emotion. Because the patterns of emotional behavior desired from a robot in IEC vary from person to person, we perform an individual preference analysis for emotional behaviors.

I. INTRODUCTION

Recently, opportunities for robots to come into contact with humans have been increasing [1], [2], so technology for interactive communication with humans is increasingly needed. In addition, for robots and humans to live together, a robot requires both the ability to understand human intention flexibly and the ability to express its own intention. Technology for realizing interactive communication between human and robot has not yet been established, so there are few robots that communicate with humans smoothly. In order to understand human intention and express robot intention, several studies using nonverbal information have been proposed [3]-[7].
According to Mehrabian, when verbal and nonverbal communication differ in conveying emotion and attitude, nonverbal communication carries over 90% of the information about the interlocutor's emotion [8]. There are various kinds of nonverbal communication: eye contact, voice, facial expression, gesture, and so on. On the other hand, people feel unpleasantness toward robot expressions that are too human-like [9]. We have been conducting research on nonverbal communication based on human and robot behaviors. In this research, a method of inferring emotion from human behavior is used [10]. First, the body features of a subject are extracted based on Laban's theory [11]. Next, the basic emotional degrees [10] are obtained by fuzzy inference using the extracted body features. Finally, the emotion value of the human behavior is evaluated based on Russell's Circumplex Model [12]. We outline these theories in Section III.

In this research, we aim to realize Interactive Emotion Communication (IEC), a bidirectional communication between human and robot based on emotional behaviors. Through this analysis, we aim to give the robot high interpersonal affinity toward the human. We can conjecture that the patterns of emotional behavior demanded from a robot in IEC differ from person to person, so this research performs an individual preference analysis of emotional behaviors for each subject.

II. INTERACTIVE EMOTION COMMUNICATION (IEC)

This research assumes a bidirectional communication model through emotional behaviors between human and robot as one example of nonverbal communication. An emotional behavior means a gesture or dance [13] that conveys an emotion to the other party. Suppose there are a human A and a robot B. First, human A generates an emotion and expresses it to robot B by gesture (Step 1).
Next, B recognizes A's emotion through its vision (Step 2). Then B generates its own emotion and expresses emotional behaviors to A (Step 3). Finally, A is cheered up by B (Step 4). In this way, interactive communication by emotional behaviors is constructed between A and B. To realize IEC, the following three processes are necessary:

1) Recognizing the human's emotion
2) Generating the robot's emotion
3) Expressing the robot's emotion

We call this bidirectional communication Interactive Emotion Communication (IEC). Humans communicate through language, but with current technology it is very hard for a robot to communicate with humans verbally. We therefore consider that human and robot can communicate smoothly through emotions. In the process of generating the robot's emotion in this paper, the robot's emotional behavior is decided based on the evaluation of the personal preference analysis described from Section IV onward. Fig. 1 shows the conceptual drawing of IEC. In this research we propose a new trial of communication between human and robot and examine the ability of the proposed method. Our final goal is to build a robot that can recognize human emotion and express its own emotion through bidirectional communication based on the IEC model, with high interpersonal affinity.
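The three IEC processes above form a simple sense-decide-act loop. The following is a minimal illustration; all function bodies and the fixed "cheer up" mapping are placeholders for this sketch, not the system described in the paper:

```python
# Minimal sketch of one IEC cycle: recognize the human's emotion,
# generate the robot's emotion, express it.

def recognize_human_emotion(observation):
    # Step 1: in the paper this is fuzzy inference over body features (FEIS).
    # Here we simply read a label out of a hypothetical observation dict.
    return observation.get("emotion", "SAD")

def generate_robot_emotion(human_emotion):
    # Step 2: in the paper this is chosen by an emotional model tuned to the
    # subject's preference; here, an illustrative fixed "cheer up" mapping.
    return {"SAD": "JOY", "ANG": "REL"}.get(human_emotion, human_emotion)

def express_robot_emotion(robot_emotion):
    # Step 3: the robot performs the behavior associated with the emotion.
    return f"robot performs {robot_emotion} behavior"

def iec_step(observation):
    human = recognize_human_emotion(observation)
    robot = generate_robot_emotion(human)
    return express_robot_emotion(robot)
```

For example, an observation labeled SAD would lead the robot to express a JOY behavior, mirroring Steps 1 to 4 of the scenario between human A and robot B.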

[Fig. 1. Interactive Emotion Communication (IEC). Step 1: human A expresses "I'm sad." Step 2 (Recognizing Emotion): robot B thinks "He seems to be sad." Step 3 (Generating and Expressing Emotion): B decides "I will make him happy" and expresses it. Step 4: A feels "I became happy."]

III. FUZZY EMOTION INFERENCE SYSTEM (FEIS)

This chapter mainly explains the first of the IEC processes above, recognizing the human's emotion, which infers human emotion from human emotional behaviors. We use fuzzy inference to infer the human emotion from human body features. We adopt Laban's theory and Russell's Circumplex Model to reduce the input and output dimensions of the fuzzy inference and to keep the fuzzy rules simple.

A. Laban's Theory

Laban's theory [11], proposed by R. Laban, is a method for extracting macro features from human body motion. It offers three types of description of motion features: Effort-Shape Description, Motif Description, and Structural Description. Among these, Effort-Shape Description describes the quality of motions and the meaning of the expression. Because it is useful for classifying human body motion by visual features, we use this description to analyze human body motions. Laban proposed that the expression of human body motion forms a bipolar system between Fighting Form, meaning active and vivid body motion, and Indulging Form, meaning slow and gentle body motion. Effort-Shape Description subdivides human motions along these forms. Effort is effective for classifying body motion based on Kansei (affective) information. Shape captures the overall static shape of the body motion and does not consider local motion features.
In this research, we define Time Effort as the speed of the body's center of gravity, Flow Effort as the acceleration of the hands, Table-Plane Shape as the area of the body, and Door-Plane Shape as the height of the body's center of gravity. These features can be measured with a robot's camera. We excluded Weight Effort, Space Effort, and Wheel Plane from the motion features of Laban's theory because they are hard to define and measure with a camera.

B. Fuzzy Emotion Inference

Fig. 2 and Table I show the membership functions, singletons, and fuzzy rules used in the FEIS of this research.

[Fig. 2. Membership functions and singletons: (a) Area La (labels AS, AM, AL over breakpoints s1-s4); (b) Position Lp (PL, PM, PH over p1-p4); (c) Velocity Lv (VS, VM, VF over v1-v4); (d) Hand Acceleration Lh (HS, HM, HL over h1-h4); (e) Pleasure-Unpleasure Rx (singletons NUL, NUM, NUS, NEU, PPS, PPM, PPL at nu3-nu1, 0, pp1-pp3); (f) Arousing-Sleepy Ry (singletons NSL, NSM, NSS, NEU, PAS, PAM, PAL at ns3-ns1, 0, pa1-pa3).]

[Table I. Fuzzy emotion inference rules: for each combination of area (AS/AM/AL), position (PL/PM/PH), velocity (VS/VM/VF), and hand acceleration (HS/HM/HL), the table gives an Rx label (upper) and an Ry label (lower).]

Tanabe et al. [10] proposed the basic theory of this system. The basic emotional degrees, extracted by the motion analysis based on Laban's theory, are defined as the input values of the fuzzy inference in this system.
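The four body features above (area La, height of the center of gravity Lp, center-of-gravity speed Lv, hand acceleration Lh) can be approximated from tracked marker positions by finite differences. The sketch below is an illustration under our own assumptions (2-D pixel coordinates per frame for head, hands, and feet; at least three frames; none of the function names come from the paper):

```python
import math

def center_of_gravity(markers):
    """Mean position of the tracked markers (head, hands, feet)."""
    xs = [x for x, y in markers]
    ys = [y for x, y in markers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def laban_features(frames, dt):
    """Illustrative Laban features over a short window of marker frames.

    frames: list of dicts {"head": (x, y), "lhand": (x, y), "rhand": (x, y),
    "lfoot": (x, y), "rfoot": (x, y)}; dt: seconds between frames.
    Requires at least three frames for the acceleration estimate.
    """
    cogs = [center_of_gravity(list(f.values())) for f in frames]
    # Time Effort: speed of the body's center of gravity.
    speed = math.dist(cogs[-1], cogs[0]) / (dt * (len(frames) - 1))
    # Flow Effort: magnitude of hand acceleration (finite differences,
    # using the right hand only in this sketch).
    hx = [f["rhand"][0] for f in frames]
    vel = [(b - a) / dt for a, b in zip(hx, hx[1:])]
    accel = abs((vel[-1] - vel[0]) / (dt * (len(vel) - 1)))
    # Table-Plane Shape: area of the bounding box around the markers.
    last = list(frames[-1].values())
    xs = [x for x, y in last]
    ys = [y for x, y in last]
    area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    # Door-Plane Shape: height of the center of gravity.
    height = cogs[-1][1]
    return {"La": area, "Lp": height, "Lv": speed, "Lh": accel}
```

These four values are then fuzzified by the membership functions of Fig. 2 and fed into the rule table.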
Values on the Pleasure-Unpleasure and Arousing-Sleepy axes are decided by the rules of Table I so that the system obtains an emotion value on Russell's Circumplex Model.

C. Russell's Circumplex Model

J. A. Russell proposed the Circumplex Model [12] in 1980, in which all emotions are arranged on a circle in a plane defined by two dimensions: Pleasure-Unpleasure and Arousing-Sleepy. Witvliet and Vrana [14], [15] additionally proposed that four basic emotions correspond to the four quadrants of this model. We therefore define human emotion using four basic emotions, one per quadrant: JOY, ANG, SAD, and REL (see Fig. 3). In this research, the human emotion is inferred from R_x (Pleasure-Unpleasure) and R_y (Arousing-Sleepy) obtained by FEIS. We decide the human emotion from the quadrant to which the inference result (R_x, R_y) belongs. The emotion value E_i (i = JOY, ANG, SAD, REL) represents the emotional strength in this method. E_i is calculated from Eqs. (1) and (2).
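This quadrant decision and strength computation can be sketched directly. This is an illustration, not the paper's code: atan2 replaces arctan(R_y/R_x) so that the angle lands in the correct quadrant, and the absolute value that keeps the strength non-negative in every quadrant is our assumption:

```python
import math

# Quadrant order on Russell's Circumplex Model:
# JOY (I), ANG (II), SAD (III), REL (IV).
EMOTIONS = ["JOY", "ANG", "SAD", "REL"]

def infer_emotion(rx, ry):
    """Map a FEIS output (Rx, Ry) to a basic emotion and its strength E_i."""
    theta = math.atan2(ry, rx) % (2 * math.pi)      # 0 <= theta < 2*pi
    quadrant = min(int(theta // (math.pi / 2)), 3)  # guard the 2*pi edge
    strength = math.hypot(rx, ry) * abs(math.sin(math.pi - 2 * theta))
    return EMOTIONS[quadrant], strength
```

For a point on a quadrant diagonal, such as (R_x, R_y) = (100, 100), the strength equals the distance from the origin; it falls toward zero as the point approaches either axis.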

[Fig. 3. Basic emotions on Russell's Circumplex Model (quoted from reference [12]). The horizontal axis Rx runs from Unpleasure to Pleasure and the vertical axis Ry from Sleepy to Arousing. Emotions such as EXCITED, DELIGHTED, HAPPY, and PLEASED fall in the JOY quadrant; ALARMED, AFRAID, ANGRY, ANNOYED, and FRUSTRATED in the ANGER quadrant; MISERABLE, DEPRESSED, SAD, GLOOMY, and BORED in the SADNESS quadrant; and CALM, AT EASE, RELAXED, SATISFIED, and CONTENT in the RELAXATION quadrant.]

E_i = √(R_x^2 + R_y^2) · sin(π − 2θ)   (1)

θ = arctan(R_y / R_x)   (2)

i = JOY (0 ≤ θ < π/2), ANG (π/2 ≤ θ < π), SAD (π ≤ θ < 3π/2), REL (3π/2 ≤ θ < 2π)

D. Algorithm Flow

Figure 4 shows the procedure of the Fuzzy Emotion Inference System (FEIS) proposed in this research. FEIS is constructed with the following algorithm:

(1) Measure human emotional behaviors with a CCD camera.
(2) Extract body features by motion analysis based on Laban's theory.
(3) Calculate the basic emotional degrees by fuzzy inference using the body features.
(4) Obtain the emotion value from the basic emotional degrees using Russell's Circumplex Model.
(5) Express robot emotional behaviors based on the emotion value.

In this research the four emotions Joy (JOY), Anger (ANG), Sadness (SAD), and Relaxation (REL) are used, as they are commonly treated as the basic emotions in Japan. In the following sections, we discuss human and robot emotions only in terms of these four types.

IV. EXPERIMENT OF IEC

Having inferred human emotions with the above-mentioned FEIS, we additionally attempted an interactive experiment between a human and a pet-type robot. In this experiment, we denote the basic emotional behaviors expressed by the human as JOY-H, ANG-H, SAD-H, and REL-H; the basic emotional behaviors expressed by the robot as JOY-R, ANG-R, SAD-R, and REL-R; and the emotions estimated from human emotional behaviors by FEIS as JOY-F, ANG-F, SAD-F, and REL-F.

A. Experimental Environment

We used an AIBO (SONY ERS-7) as the pet-type robot in this experiment because we consider that a pet-type robot possesses high interpersonal affinity.
The environment of this experiment is shown in Fig. 5, which gives an example of communication between a subject and the robot through the computer in the case of JOY-H. Table II shows the membership function values (see Fig. 2) used in this experiment; we decided these membership function and singleton values by having subjects express various behaviors and answer questionnaires. FEIS was implemented on a computer so that we could gather a large amount of image data (3 to 5 frames/sec) of human expression, and the outputs of FEIS were sent to the robot through a wireless LAN. In addition, we asked an observer to evaluate the accuracy of FEIS.

[Fig. 4. Fuzzy Emotion Inference System (FEIS): (1) human emotional behavior is measured, (2) body features are extracted by motion analysis based on Laban's theory, (3) basic emotional degrees are computed by fuzzy inference, (4) an emotion value is obtained on Russell's Circumplex Model, and (5) the robot expresses emotional behavior.]

[Fig. 5. Experimental environment (in the case of JOY-H).]

B. Experimental Precondition

This experiment was conducted with two university students in their twenties as subjects. Each subject attached five markers, each of a different color, to his head, both hands, and both feet so that the computer could extract his body features. We let the subjects perform emotional behaviors freely, without restrictions on how to express each basic emotion.
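Tracking colored markers of this kind is commonly done by color thresholding. The following is a minimal sketch with NumPy; the color tolerance and the RGB frame layout are our assumptions, not the paper's implementation:

```python
import numpy as np

def marker_centroid(frame, color, tol=30):
    """Centroid (x, y) of pixels within `tol` of `color` in an RGB frame.

    frame: H x W x 3 uint8 array; color: (r, g, b) tuple.
    Returns None when no pixel matches the marker color."""
    diff = np.abs(frame.astype(int) - np.array(color, dtype=int))
    mask = (diff <= tol).all(axis=2)   # pixel matches on all three channels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

Running one such detector per marker color on each frame yields the five marker trajectories from which the Laban features are computed.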

TABLE II. MEMBERSHIP FUNCTION AND SINGLETON VALUES
La: s1 = 50, s2 = 300, s3 = 450, s4 = 700
Lp: p1 = 85, p2 = 170, p3 = 200, p4 = 300
Lv: v1 = 5, v2 = 10, v3 = 25, v4 = 50
Lh: h1 = 20, h2 = 45, h3 = 90, h4 = 200
Rx: pp1 = 100, pp2 = 200, pp3 = 300; nu1 = 100, nu2 = 200, nu3 = 300
Ry: pa1 = 100, pa2 = 200, pa3 = 300; ns1 = 100, ns2 = 200, ns3 = 300

Each subject expresses emotional behaviors while imagining various situations, as shown in Table III. In order to approximate natural everyday situations, we asked the two subjects in advance to think of situations that they could easily imagine; Table III shows the emotional situations imagined by the two subjects. In this experiment, the subject's emotional behaviors are first measured by a web camera connected to the personal computer (see Fig. 5). Next, the human emotion is recognized from the emotional behaviors by FEIS, with an observer checking the output of FEIS in real time. The output of FEIS is then sent to the robot through the wireless LAN, and finally the robot expresses emotional behaviors according to that output. The robot's emotional behaviors were restricted to one motion for each of the four patterns (JOY-R, ANG-R, SAD-R, REL-R), and each behavior was performed within 3 to 6 seconds. To simplify the experiment, we assigned the robot the following actions:

JOY-R: The robot holds up both hands cheerfully.
ANG-R: The robot flings both hands to the ground.
SAD-R: The robot droops its head.
REL-R: The robot stretches its arms and legs.

The subject observes the robot in front of him while expressing his own emotional behaviors, and a questionnaire on his impression of each robot reaction was conducted after the experiment. The subject was told beforehand which emotion each robot behavior expresses, so that he could understand the robot's emotion. The experimental flow is as follows.
Step 1: We constructed six emotional models (see Table IV) for generating the robot's emotion.

TABLE III. EMOTIONAL SITUATIONS IMAGINED BY TWO SUBJECTS
Subject A: JOY-H: a trial was successful. ANG-H: a trial was unsuccessful. SAD-H: a precious item was broken. REL-H: he has a hot drink.
Subject B: JOY-H: his desire was satisfied. ANG-H: he felt insulted. SAD-H: he was betrayed. REL-H: he is absorbed in his hobby.

TABLE IV. SIX EMOTIONAL MODELS IN THE EXPERIMENT
Model 1: The robot expresses each emotional behavior in the same relationship to the FEIS result, i.e., the same emotion the human expressed.
Model 2: The robot expresses random emotional behaviors, with no relationship to the FEIS result.
Model 3: The robot expresses emotional behaviors with positive evaluation values.
Model 4: The robot expresses the top two emotional behaviors with positive evaluation values.
Model 5: The robot expresses only the emotional behavior with the best evaluation.
Model 6: The robot expresses only the emotional behavior of the other subject's best model.

Step 1-a: In a preliminary experiment [17], we performed the personal preference analysis and selected the generation patterns of the robot's emotion (see Table V and Table VI).
Step 1-b: We constructed the effective models (models 4 to 6) for generating the robot's emotion according to subject A's and B's preferences.
Step 2: We performed the Interactive Emotion Communication experiment between human and robot.
Step 2-a: The subject, wearing the five color markers, expresses emotional behaviors for one minute per emotion while recalling an emotional situation (see Table III).
Step 2-b: The web camera captures images of the subject's emotional behavior, and the human emotion is recognized by FEIS on the computer.
Step 2-c: The computer generates the robot's emotion according to the emotional model (see Table V and Table VI).
Step 2-d: The computer sends the robot's emotion to the robot.
Step 2-e: The robot expresses emotional behaviors.
Step 2-f: The subject observes the robot while expressing his emotional behaviors.
Step 2-g: Steps 2-a to 2-f are repeated.
Step 3: After the experiment, the subject answers a questionnaire on his impression of each emotional behavior of the robot.

C. Impression of Robot Emotional Behavior

In this experiment we evaluate each emotional model constructed in the preliminary experiment [17]. For the questionnaire we prepared six adjective pairs, Animal-like/Mechanical (S1), Interesting/Boring (S2), Complex/Simple (S3), Familiar/Unfamiliar (S4), Natural/Unnatural (S5), and Likable/Dislikable (S6), and the subjects rated the robot's reactions on a seven-grade scale (−3 to 3). To quantify the degree of likability felt by each subject, we calculated an evaluation value σ, the weighted

sum of the subject's evaluation scores S_i (i = 1, ..., 6) in the questionnaire, given by Eq. (3):

σ = Σ α_i S_i, summed over i = 1, ..., 6   (3)

[Table V. Emotional model for subject A. Table VI. Emotional model for subject B. For each of models 1 to 6, the tables list the percentage with which each robot emotion (JOY-R, ANG-R, SAD-R, REL-R) is generated in response to each human emotion (JOY-H, ANG-H, SAD-H, REL-H). For example, model 1 maps each human emotion to the same robot emotion with probability 100%, and model 2 generates each robot emotion with probability 25%.]

We asked both subjects for the significance weights α_i (i = 1, ..., 6) of the six adjective pairs in order to calculate σ; each weight was obtained in advance as the percentage significance of the corresponding adjective pair in a questionnaire given to the subjects. The weights α_i are expressed as percentages for Animal-like (α1), Interesting (α2), Complex (α3), Familiar (α4), Natural (α5), and Likable (α6). From the preliminary experiment [17], we obtained (α1, α2, α3, α4, α5, α6) = (0.5, 0.05, 0.5, 0.5, 0.0, 0.5) for subject A and (0.5, 0., 0.025, 0., 0.25, 0.5) for subject B as the weights of the scores S_i. Subject A's and subject B's evaluation values (σ_A, σ_B), calculated with Eq. (3), let us compare their personal preferences.
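Eq. (3) is a plain weighted sum over the six adjective-pair scores. For illustration, with made-up weights and scores (not the values measured in the experiment):

```python
def evaluation_value(weights, scores):
    """sigma = sum_i alpha_i * S_i over the six adjective-pair scores."""
    assert len(weights) == len(scores) == 6
    return sum(a * s for a, s in zip(weights, scores))

# Hypothetical example: six significance weights and one subject's
# seven-grade scores (-3..3) for a single emotional model.
alpha = [0.15, 0.05, 0.15, 0.15, 0.10, 0.40]
scores = [2, 1, -1, 3, 0, 2]
sigma = evaluation_value(alpha, scores)  # 0.3 + 0.05 - 0.15 + 0.45 + 0 + 0.8 = 1.45
```

A heavier weight on an adjective pair such as Likable makes its score dominate σ, which is how the weights encode each subject's priorities.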
D. Experimental Results

Figure 6 shows subject A's evaluation (σ_A, solid line) and subject B's evaluation (σ_B, broken line). Table VII lists the evaluation values for the six emotional models; in the table, light-gray cells mark the worst-impression model and dark-gray cells the best-impression model for each subject. In these results, model 4 gave the best impression and model 1 the worst for subject A, while model 5 gave the best impression and model 1 the worst for subject B. Because the effective models (models 4 to 6) gave good impressions, generating the robot's emotion from all combinations of human and robot emotions proved effective in IEC. The best models of subjects A and B differed, but their worst model was the same. From these results, we confirmed that selecting the robot's emotional model by the subject's evaluation value is very useful. Although the best emotional model differed between the two subjects, both subjects' evaluation values for models 4 to 6 were higher than those for models 1 to 3. These results are important because the impressions reflect the variety of human preference. Hereafter we will investigate the tendency of preferences for robot emotions by repeating these experiments.

V. CONCLUSION

In this paper, we investigated human impressions of a robot that actually performed emotional behaviors under each emotional model. We also confirmed that the robot gives different impressions to each subject in various situations. The experimental results indicate some guidelines for raising the interpersonal affinity between human and robot. In this research, we executed the experiment with only two subjects; the next experiment must be performed with more subjects. In the future, we must construct a system in which all processes are performed on the robot.
Furthermore, tuning the parameters of the fuzzy rules for each subject takes a lot of time. We must therefore develop a system in which fuzzy rules are easy to construct even in experiments with many subjects.

ACKNOWLEDGMENT

This research was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (C), 2010.

[Fig. 6. Impressions of the two subjects.]

[Table VII. Evaluation values σ_A and σ_B of the two subjects for the six emotional models.]

REFERENCES
[1] Special Issue on Commercialization of Robotic Research Achievements, Journal of the Robotics Society of Japan, Vol.23, No.2, 2005 (in Japanese)
[2] Special Issue on Human Symbiotic Systems, Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, Vol.21, No.5, 2009 (in Japanese)
[3] K. Itoh, H. Miwa, M. Matsumoto, M. Zecca, H. Takanobu, S. Roccella, M. C. Carrozza, P. Dario and A. Takanishi, "Various Emotion Expressions with Emotion Expression Humanoid Robot WE-4RII," Proceedings of the 1st IEEE Technical Exhibition Based Conference on Robotics and Automation, pp.35-36, 2004
[4] A. Bruce, I. Nourbakhsh and R. Simmons, "The Role of Expressiveness and Attention in Human-Robot Interaction," Proceedings of the 2002 IEEE International Conference on Robotics and Automation (ICRA '02), Vol.4, 2002
[5] P. Y. Oudeyer, "The production and recognition of emotions in speech: features and algorithms," International Journal of Human-Computer Studies, Vol.62, pp.157-183, 2003
[6] M. Kanoh, S. Iwata, S. Kato and H. Itoh, "Emotive Facial Expressions of Sensitivity Communication Robot Ifbot," Kansei Engineering International, Vol.5, No.3, pp.35-42, 2005
[7] S. Mitsuyoshi, F. Ren, Y. Tanaka and S. Kuroiwa, "Non-verbal Voice Emotion Analysis System," International Journal of Innovative Computing, Information and Control, Vol.2, No.4, 2006
[8] A. Mehrabian, Nonverbal Communication, Aldine De Gruyter, 2007
[9] K. F. MacDorman, "Androids as an Experimental Apparatus: Why Is There an Uncanny Valley and Can We Exploit It?," Toward Social Mechanisms of Android Science, pp.106-118, 2005
[10] N. Tanabe and Y. Maeda, "Emotional Behavior Evaluation Method Used Fuzzy Reasoning for Pet-type Robot," Human and Artificial Intelligence Systems: From Control to Autonomy (HART), 2004
[11] R. Laban, The Mastery of Movement, Plays, Inc., 1971
[12] J. A.
Russell, "A circumplex model of affect," Journal of Personality and Social Psychology, Vol.39, pp.1161-1178, 1980
[13] T. Nakata, T. Mori and T. Sato, "Analysis of Impression of Robot Bodily Expression," Journal of Robotics and Mechatronics, Vol.14, No.1, pp.27-36, 2002
[14] C. V. O. Witvliet and S. R. Vrana, "Psychophysiological responses as indices of affective dimensions," Psychophysiology, Vol.32, 1995
[15] C. V. O. Witvliet and S. R. Vrana, "Emotional imagery, the visual startle, and covariation bias: An affective matching account," Biological Psychology, Vol.52, 2000
[16] E. L. Lehmann, Nonparametrics: Statistical Methods Based on Ranks, Holden-Day, Inc., 1975
[17] R. Taki, Y. Maeda and Y. Takahashi, "Personal Preference Analysis for Emotional Behavior Response of Autonomous Robot in Interactive Emotion Communication," Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.14, No.7, 2010 (in press)


More information

Non Verbal Communication of Emotions in Social Robots

Non Verbal Communication of Emotions in Social Robots Non Verbal Communication of Emotions in Social Robots Aryel Beck Supervisor: Prof. Nadia Thalmann BeingThere Centre, Institute for Media Innovation, Nanyang Technological University, Singapore INTRODUCTION

More information

Generating Personality Character in a Face Robot through Interaction with Human

Generating Personality Character in a Face Robot through Interaction with Human Generating Personality Character in a Face Robot through Interaction with Human F. Iida, M. Tabata and F. Hara Department of Mechanical Engineering Science University of Tokyo - Kagurazaka, Shinjuku-ku,

More information

Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface

Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface 6th ERCIM Workshop "User Interfaces for All" Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface Tsutomu MIYASATO ATR Media Integration & Communications 2-2-2 Hikaridai, Seika-cho,

More information

A Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals

A Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals , March 12-14, 2014, Hong Kong A Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals Mingmin Yan, Hiroki Tamura, and Koichi Tanno Abstract The aim of this study is to present

More information

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors In: M.H. Hamza (ed.), Proceedings of the 21st IASTED Conference on Applied Informatics, pp. 1278-128. Held February, 1-1, 2, Insbruck, Austria Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

More information

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment-

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- Hitoshi Hasunuma, Kensuke Harada, and Hirohisa Hirukawa System Technology Development Center,

More information

Emotion Based Music Player

Emotion Based Music Player ISSN 2278 0211 (Online) Emotion Based Music Player Nikhil Zaware Tejas Rajgure Amey Bhadang D. D. Sapkal Professor, Department of Computer Engineering, Pune, India Abstract: Facial expression provides

More information

Autonomic gaze control of avatars using voice information in virtual space voice chat system

Autonomic gaze control of avatars using voice information in virtual space voice chat system Autonomic gaze control of avatars using voice information in virtual space voice chat system Kinya Fujita, Toshimitsu Miyajima and Takashi Shimoji Tokyo University of Agriculture and Technology 2-24-16

More information

Implications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA

Implications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA Implications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA Tatsuya Nomura,, No Member, Takayuki Kanda, Member, IEEE, Tomohiro Suzuki, No

More information

The effect of gaze behavior on the attitude towards humanoid robots

The effect of gaze behavior on the attitude towards humanoid robots The effect of gaze behavior on the attitude towards humanoid robots Bachelor Thesis Date: 27-08-2012 Author: Stefan Patelski Supervisors: Raymond H. Cuijpers, Elena Torta Human Technology Interaction Group

More information

Making a Mobile Robot to Express its Mind by Motion Overlap

Making a Mobile Robot to Express its Mind by Motion Overlap 7 Making a Mobile Robot to Express its Mind by Motion Overlap Kazuki Kobayashi 1 and Seiji Yamada 2 1 Shinshu University, 2 National Institute of Informatics Japan 1. Introduction Various home robots like

More information

Context Aware Computing

Context Aware Computing Context Aware Computing Context aware computing: the use of sensors and other sources of information about a user s context to provide more relevant information and services Context independent: acts exactly

More information

Handling Emotions in Human-Computer Dialogues

Handling Emotions in Human-Computer Dialogues Handling Emotions in Human-Computer Dialogues Johannes Pittermann Angela Pittermann Wolfgang Minker Handling Emotions in Human-Computer Dialogues ABC Johannes Pittermann Universität Ulm Inst. Informationstechnik

More information

Robotic Systems ECE 401RB Fall 2007

Robotic Systems ECE 401RB Fall 2007 The following notes are from: Robotic Systems ECE 401RB Fall 2007 Lecture 14: Cooperation among Multiple Robots Part 2 Chapter 12, George A. Bekey, Autonomous Robots: From Biological Inspiration to Implementation

More information

CS295-1 Final Project : AIBO

CS295-1 Final Project : AIBO CS295-1 Final Project : AIBO Mert Akdere, Ethan F. Leland December 20, 2005 Abstract This document is the final report for our CS295-1 Sensor Data Management Course Final Project: Project AIBO. The main

More information

Informing a User of Robot s Mind by Motion

Informing a User of Robot s Mind by Motion Informing a User of Robot s Mind by Motion Kazuki KOBAYASHI 1 and Seiji YAMADA 2,1 1 The Graduate University for Advanced Studies 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430 Japan kazuki@grad.nii.ac.jp

More information

Experimental Investigation into Influence of Negative Attitudes toward Robots on Human Robot Interaction

Experimental Investigation into Influence of Negative Attitudes toward Robots on Human Robot Interaction Experimental Investigation into Influence of Negative Attitudes toward Robots on Human Robot Interaction Tatsuya Nomura 1,2 1 Department of Media Informatics, Ryukoku University 1 5, Yokotani, Setaohe

More information

HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments

HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments Book Title Book Editors IOS Press, 2003 1 HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments Tetsunari Inamura a,1, Masayuki Inaba a and Hirochika Inoue a a Dept. of

More information

Evaluation of a Tricycle-style Teleoperational Interface for Children: a Comparative Experiment with a Video Game Controller

Evaluation of a Tricycle-style Teleoperational Interface for Children: a Comparative Experiment with a Video Game Controller 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication. September 9-13, 2012. Paris, France. Evaluation of a Tricycle-style Teleoperational Interface for Children:

More information

ESTIMATING ROAD TRAFFIC PARAMETERS FROM MOBILE COMMUNICATIONS

ESTIMATING ROAD TRAFFIC PARAMETERS FROM MOBILE COMMUNICATIONS ESTIMATING ROAD TRAFFIC PARAMETERS FROM MOBILE COMMUNICATIONS R. Bolla, F. Davoli, A. Giordano Department of Communications, Computer and Systems Science (DIST University of Genoa Via Opera Pia 13, I-115

More information

Physical and Affective Interaction between Human and Mental Commit Robot

Physical and Affective Interaction between Human and Mental Commit Robot Proceedings of the 21 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 21 Physical and Affective Interaction between Human and Mental Commit Robot Takanori Shibata Kazuo Tanie

More information

Emotional BWI Segway Robot

Emotional BWI Segway Robot Emotional BWI Segway Robot Sangjin Shin https:// github.com/sangjinshin/emotional-bwi-segbot 1. Abstract The Building-Wide Intelligence Project s Segway Robot lacked emotions and personality critical in

More information

Cognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many

Cognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many Preface The jubilee 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 was held in the conference centre of the Best Western Hotel M, Belgrade, Serbia, from 30 June to 2 July

More information

EARIN Jarosław Arabas Room #223, Electronics Bldg.

EARIN   Jarosław Arabas Room #223, Electronics Bldg. EARIN http://elektron.elka.pw.edu.pl/~jarabas/earin.html Jarosław Arabas jarabas@elka.pw.edu.pl Room #223, Electronics Bldg. Paweł Cichosz pcichosz@elka.pw.edu.pl Room #215, Electronics Bldg. EARIN Jarosław

More information

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space Proceedings of the 21 IEEE International Conference on Robotics & Automation Seoul, Korea May 2126, 21 Robot based on the Equations of Emotion defined in the 3D Mental Space Hiroyasu Miwa *, Tomohiko Umetsu

More information

Changes of Impression in the Animation Characters with the Different Color and Thickness in Outlines

Changes of Impression in the Animation Characters with the Different Color and Thickness in Outlines KEER2014, LINKÖPING JUNE 11-13 2014 INTERNATIONAL CONFERENCE ON KANSEI ENGINEERING AND EMOTION RESEARCH Changes of Impression in the Animation Characters with the Different Color and Thickness in Outlines

More information

Diseño y Evaluación de Sistemas Interactivos COM Affective Aspects of Interaction Design 19 de Octubre de 2010

Diseño y Evaluación de Sistemas Interactivos COM Affective Aspects of Interaction Design 19 de Octubre de 2010 Diseño y Evaluación de Sistemas Interactivos COM-14112-001 Affective Aspects of Interaction Design 19 de Octubre de 2010 Dr. Víctor M. González y González victor.gonzalez@itam.mx Agenda 1. MexIHC 2010

More information

Available online at ScienceDirect. Procedia Computer Science 76 (2015 ) 2 8

Available online at   ScienceDirect. Procedia Computer Science 76 (2015 ) 2 8 Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 76 (2015 ) 2 8 2015 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS 2015) Systematic Educational

More information

Effect of Cognitive Biases on Human-Robot Interaction: A Case Study of Robot's Misattribution

Effect of Cognitive Biases on Human-Robot Interaction: A Case Study of Robot's Misattribution Effect of Cognitive Biases on Human-Robot Interaction: A Case Study of Robot's Misattribution Biswas, M. and Murray, J. Abstract This paper presents a model for developing longterm human-robot interactions

More information

Concept and Architecture of a Centaur Robot

Concept and Architecture of a Centaur Robot Concept and Architecture of a Centaur Robot Satoshi Tsuda, Yohsuke Oda, Kuniya Shinozaki, and Ryohei Nakatsu Kwansei Gakuin University, School of Science and Technology 2-1 Gakuen, Sanda, 669-1337 Japan

More information

A Constructive Approach for Communication Robots. Takayuki Kanda

A Constructive Approach for Communication Robots. Takayuki Kanda A Constructive Approach for Communication Robots Takayuki Kanda Abstract In the past several years, many humanoid robots have been developed based on the most advanced robotics technologies. If these

More information

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced

More information

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Quy-Hung Vu, Byeong-Sang Kim, Jae-Bok Song Korea University 1 Anam-dong, Seongbuk-gu, Seoul, Korea vuquyhungbk@yahoo.com, lovidia@korea.ac.kr,

More information

Cooperative Transportation by Humanoid Robots Learning to Correct Positioning

Cooperative Transportation by Humanoid Robots Learning to Correct Positioning Cooperative Transportation by Humanoid Robots Learning to Correct Positioning Yutaka Inoue, Takahiro Tohge, Hitoshi Iba Department of Frontier Informatics, Graduate School of Frontier Sciences, The University

More information

Development and Evaluation of a Centaur Robot

Development and Evaluation of a Centaur Robot Development and Evaluation of a Centaur Robot 1 Satoshi Tsuda, 1 Kuniya Shinozaki, and 2 Ryohei Nakatsu 1 Kwansei Gakuin University, School of Science and Technology 2-1 Gakuen, Sanda, 669-1337 Japan {amy65823,

More information

The Relationship between the Arrangement of Participants and the Comfortableness of Conversation in HyperMirror

The Relationship between the Arrangement of Participants and the Comfortableness of Conversation in HyperMirror The Relationship between the Arrangement of Participants and the Comfortableness of Conversation in HyperMirror Osamu Morikawa 1 and Takanori Maesako 2 1 Research Institute for Human Science and Biomedical

More information

Real-time Reconstruction of Wide-Angle Images from Past Image-Frames with Adaptive Depth Models

Real-time Reconstruction of Wide-Angle Images from Past Image-Frames with Adaptive Depth Models Real-time Reconstruction of Wide-Angle Images from Past Image-Frames with Adaptive Depth Models Kenji Honda, Naoki Hashinoto, Makoto Sato Precision and Intelligence Laboratory, Tokyo Institute of Technology

More information

Analysis of humanoid appearances in human-robot interaction

Analysis of humanoid appearances in human-robot interaction Analysis of humanoid appearances in human-robot interaction Takayuki Kanda, Takahiro Miyashita, Taku Osada 2, Yuji Haikawa 2, Hiroshi Ishiguro &3 ATR Intelligent Robotics and Communication Labs. 2 Honda

More information

Real-Time Bilateral Control for an Internet-Based Telerobotic System

Real-Time Bilateral Control for an Internet-Based Telerobotic System 708 Real-Time Bilateral Control for an Internet-Based Telerobotic System Jahng-Hyon PARK, Joonyoung PARK and Seungjae MOON There is a growing tendency to use the Internet as the transmission medium of

More information

Development of a Simulator of Environment and Measurement for Autonomous Mobile Robots Considering Camera Characteristics

Development of a Simulator of Environment and Measurement for Autonomous Mobile Robots Considering Camera Characteristics Development of a Simulator of Environment and Measurement for Autonomous Mobile Robots Considering Camera Characteristics Kazunori Asanuma 1, Kazunori Umeda 1, Ryuichi Ueda 2, and Tamio Arai 2 1 Chuo University,

More information

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino

More information

An Intelligent Robot Based on Emotion Decision Model

An Intelligent Robot Based on Emotion Decision Model An Intelligent Robot Based on Emotion Decision Model Liu Yaofeng * Wang Zhiliang Wang Wei Jiang Xiao School of Information, Beijing University of Science and Technology, Beijing 100083, China *Corresponding

More information

Background Pixel Classification for Motion Detection in Video Image Sequences

Background Pixel Classification for Motion Detection in Video Image Sequences Background Pixel Classification for Motion Detection in Video Image Sequences P. Gil-Jiménez, S. Maldonado-Bascón, R. Gil-Pita, and H. Gómez-Moreno Dpto. de Teoría de la señal y Comunicaciones. Universidad

More information

Birth of An Intelligent Humanoid Robot in Singapore

Birth of An Intelligent Humanoid Robot in Singapore Birth of An Intelligent Humanoid Robot in Singapore Ming Xie Nanyang Technological University Singapore 639798 Email: mmxie@ntu.edu.sg Abstract. Since 1996, we have embarked into the journey of developing

More information

Contents. Mental Commit Robot (Mental Calming Robot) Industrial Robots. In What Way are These Robots Intelligent. Video: Mental Commit Robots

Contents. Mental Commit Robot (Mental Calming Robot) Industrial Robots. In What Way are These Robots Intelligent. Video: Mental Commit Robots Human Robot Interaction for Psychological Enrichment Dr. Takanori Shibata Senior Research Scientist Intelligent Systems Institute National Institute of Advanced Industrial Science and Technology (AIST)

More information

Group Robots Forming a Mechanical Structure - Development of slide motion mechanism and estimation of energy consumption of the structural formation -

Group Robots Forming a Mechanical Structure - Development of slide motion mechanism and estimation of energy consumption of the structural formation - Proceedings 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation July 16-20, 2003, Kobe, Japan Group Robots Forming a Mechanical Structure - Development of slide motion

More information

Does the Appearance of a Robot Affect Users Ways of Giving Commands and Feedback?

Does the Appearance of a Robot Affect Users Ways of Giving Commands and Feedback? 19th IEEE International Symposium on Robot and Human Interactive Communication Principe di Piemonte - Viareggio, Italy, Sept. 12-15, 2010 Does the Appearance of a Robot Affect Users Ways of Giving Commands

More information

Near Infrared Face Image Quality Assessment System of Video Sequences

Near Infrared Face Image Quality Assessment System of Video Sequences 2011 Sixth International Conference on Image and Graphics Near Infrared Face Image Quality Assessment System of Video Sequences Jianfeng Long College of Electrical and Information Engineering Hunan University

More information

Keywords: Multi-robot adversarial environments, real-time autonomous robots

Keywords: Multi-robot adversarial environments, real-time autonomous robots ROBOT SOCCER: A MULTI-ROBOT CHALLENGE EXTENDED ABSTRACT Manuela M. Veloso School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213, USA veloso@cs.cmu.edu Abstract Robot soccer opened

More information

Touch Perception and Emotional Appraisal for a Virtual Agent

Touch Perception and Emotional Appraisal for a Virtual Agent Touch Perception and Emotional Appraisal for a Virtual Agent Nhung Nguyen, Ipke Wachsmuth, Stefan Kopp Faculty of Technology University of Bielefeld 33594 Bielefeld Germany {nnguyen, ipke, skopp}@techfak.uni-bielefeld.de

More information

Sensor system of a small biped entertainment robot

Sensor system of a small biped entertainment robot Advanced Robotics, Vol. 18, No. 10, pp. 1039 1052 (2004) VSP and Robotics Society of Japan 2004. Also available online - www.vsppub.com Sensor system of a small biped entertainment robot Short paper TATSUZO

More information

University of Toronto. Companion Robot Security. ECE1778 Winter Wei Hao Chang Apper Alexander Hong Programmer

University of Toronto. Companion Robot Security. ECE1778 Winter Wei Hao Chang Apper Alexander Hong Programmer University of Toronto Companion ECE1778 Winter 2015 Creative Applications for Mobile Devices Wei Hao Chang Apper Alexander Hong Programmer April 9, 2015 Contents 1 Introduction 3 1.1 Problem......................................

More information

Prediction of Human s Movement for Collision Avoidance of Mobile Robot

Prediction of Human s Movement for Collision Avoidance of Mobile Robot Prediction of Human s Movement for Collision Avoidance of Mobile Robot Shunsuke Hamasaki, Yusuke Tamura, Atsushi Yamashita and Hajime Asama Abstract In order to operate mobile robot that can coexist with

More information

Haptic Invitation of Textures: An Estimation of Human Touch Motions

Haptic Invitation of Textures: An Estimation of Human Touch Motions Haptic Invitation of Textures: An Estimation of Human Touch Motions Hikaru Nagano, Shogo Okamoto, and Yoji Yamada Department of Mechanical Science and Engineering, Graduate School of Engineering, Nagoya

More information

CONTACT: , ROBOTIC BASED PROJECTS

CONTACT: , ROBOTIC BASED PROJECTS ROBOTIC BASED PROJECTS 1. ADVANCED ROBOTIC PICK AND PLACE ARM AND HAND SYSTEM 2. AN ARTIFICIAL LAND MARK DESIGN BASED ON MOBILE ROBOT LOCALIZATION AND NAVIGATION 3. ANDROID PHONE ACCELEROMETER SENSOR BASED

More information

SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The

SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The 29 th Annual Conference of The Robotics Society of

More information

The Advent of New Information Content

The Advent of New Information Content Special Edition on 21st Century Solutions Solutions for the 21st Century Takahiro OD* bstract In the past few years, accompanying the explosive proliferation of the, the setting for information provision

More information

The description of team KIKS

The description of team KIKS The description of team KIKS Keitaro YAMAUCHI 1, Takamichi YOSHIMOTO 2, Takashi HORII 3, Takeshi CHIKU 4, Masato WATANABE 5,Kazuaki ITOH 6 and Toko SUGIURA 7 Toyota National College of Technology Department

More information

Human-Robot Collaborative Dance

Human-Robot Collaborative Dance Human-Robot Collaborative Dance Nikhil Baheti, Kim Baraka, Paul Calhoun, and Letian Zhang Mentor: Prof. Manuela Veloso 16-662: Robot autonomy Final project presentation April 27, 2016 Motivation - Work

More information

RoboCup: Not Only a Robotics Soccer Game but also a New Market Created for Future

RoboCup: Not Only a Robotics Soccer Game but also a New Market Created for Future RoboCup: Not Only a Robotics Soccer Game but also a New Market Created for Future Kuo-Yang Tu Institute of Systems and Control Engineering National Kaohsiung First University of Science and Technology

More information

Young Children s Folk Knowledge of Robots

Young Children s Folk Knowledge of Robots Young Children s Folk Knowledge of Robots Nobuko Katayama College of letters, Ritsumeikan University 56-1, Tojiin Kitamachi, Kita, Kyoto, 603-8577, Japan E-mail: komorin731@yahoo.co.jp Jun ichi Katayama

More information

Analog Implementation of Neo-Fuzzy Neuron and Its On-board Learning

Analog Implementation of Neo-Fuzzy Neuron and Its On-board Learning Analog Implementation of Neo-Fuzzy Neuron and Its On-board Learning TSUTOMU MIKI and TAKESHI YAMAKAWA Department of Control Engineering and Science Kyushu Institute of Technology 68-4 Kawazu, Iizuka, Fukuoka

More information

Controlling Humanoid Robot Using Head Movements

Controlling Humanoid Robot Using Head Movements Volume-5, Issue-2, April-2015 International Journal of Engineering and Management Research Page Number: 648-652 Controlling Humanoid Robot Using Head Movements S. Mounica 1, A. Naga bhavani 2, Namani.Niharika

More information

Optic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball

Optic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball Optic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball Masaki Ogino 1, Masaaki Kikuchi 1, Jun ichiro Ooga 1, Masahiro Aono 1 and Minoru Asada 1,2 1 Dept. of Adaptive Machine

More information

A NOVEL IMAGE PROCESSING TECHNIQUE TO EXTRACT FACIAL EXPRESSIONS FROM MOUTH REGIONS

A NOVEL IMAGE PROCESSING TECHNIQUE TO EXTRACT FACIAL EXPRESSIONS FROM MOUTH REGIONS A NOVEL IMAGE PROCESSING TECHNIQUE TO EXTRACT FACIAL EXPRESSIONS FROM MOUTH REGIONS S.Sowmiya 1, Dr.K.Krishnaveni 2 1 Student, Department of Computer Science 2 1, 2 Associate Professor, Department of Computer

More information

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged ADVANCED ROBOTICS SOLUTIONS * Intelli Mobile Robot for Multi Specialty Operations * Advanced Robotic Pick and Place Arm and Hand System * Automatic Color Sensing Robot using PC * AI Based Image Capturing

More information

Concept and Architecture of a Centaur Robot

Concept and Architecture of a Centaur Robot Concept and Architecture of a Centaur Robot Satoshi Tsuda, Yohsuke Oda, Kuniya Shinozaki, and Ryohei Nakatsu Kwansei Gakuin University, School of Science and Technology 2-1 Gakuen, Sanda, 669-1337 Japan

More information

IN MOST human robot coordination systems that have

IN MOST human robot coordination systems that have IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 54, NO. 2, APRIL 2007 699 Dance Step Estimation Method Based on HMM for Dance Partner Robot Takahiro Takeda, Student Member, IEEE, Yasuhisa Hirata, Member,

More information

Social Acceptance of Humanoid Robots

Social Acceptance of Humanoid Robots Social Acceptance of Humanoid Robots Tatsuya Nomura Department of Media Informatics, Ryukoku University, Japan nomura@rins.ryukoku.ac.jp 2012/11/29 1 Contents Acceptance of Humanoid Robots Technology Acceptance

More information

Converting Motion between Different Types of Humanoid Robots Using Genetic Algorithms

Converting Motion between Different Types of Humanoid Robots Using Genetic Algorithms Converting Motion between Different Types of Humanoid Robots Using Genetic Algorithms Mari Nishiyama and Hitoshi Iba Abstract The imitation between different types of robots remains an unsolved task for

More information

Facial Caricaturing Robot COOPER in EXPO 2005

Facial Caricaturing Robot COOPER in EXPO 2005 Facial Caricaturing Robot COOPER in EXPO 2005 Takayuki Fujiwara, Takashi Watanabe, Takuma Funahashi, Hiroyasu Koshimizu and Katsuya Suzuki School of Information Sciences and Technology Chukyo University

More information

Robot: Geminoid F This android robot looks just like a woman

Robot: Geminoid F This android robot looks just like a woman ProfileArticle Robot: Geminoid F This android robot looks just like a woman For the complete profile with media resources, visit: http://education.nationalgeographic.org/news/robot-geminoid-f/ Program

More information

On Intelligence Jeff Hawkins

On Intelligence Jeff Hawkins On Intelligence Jeff Hawkins Chapter 8: The Future of Intelligence April 27, 2006 Presented by: Melanie Swan, Futurist MS Futures Group 650-681-9482 m@melanieswan.com http://www.melanieswan.com Building

More information

STREAK DETECTION ALGORITHM FOR SPACE DEBRIS DETECTION ON OPTICAL IMAGES

STREAK DETECTION ALGORITHM FOR SPACE DEBRIS DETECTION ON OPTICAL IMAGES STREAK DETECTION ALGORITHM FOR SPACE DEBRIS DETECTION ON OPTICAL IMAGES Alessandro Vananti, Klaus Schild, Thomas Schildknecht Astronomical Institute, University of Bern, Sidlerstrasse 5, CH-3012 Bern,

More information

EMERGENCE OF COMMUNICATION IN TEAMS OF EMBODIED AND SITUATED AGENTS

EMERGENCE OF COMMUNICATION IN TEAMS OF EMBODIED AND SITUATED AGENTS EMERGENCE OF COMMUNICATION IN TEAMS OF EMBODIED AND SITUATED AGENTS DAVIDE MAROCCO STEFANO NOLFI Institute of Cognitive Science and Technologies, CNR, Via San Martino della Battaglia 44, Rome, 00185, Italy

More information

Adaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers

Adaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers Proceedings of the 3 rd International Conference on Mechanical Engineering and Mechatronics Prague, Czech Republic, August 14-15, 2014 Paper No. 170 Adaptive Humanoid Robot Arm Motion Generation by Evolved

More information