Robot Personality based on the Equations of Emotion defined in the 3D Mental Space


Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space

Hiroyasu Miwa*, Tomohiko Umetsu*, Atsuo Takanishi**,***, Hideaki Takanobu**
*Graduate School of Science and Engineering, Waseda University
**Department of Mechanical Engineering, Waseda University
***Humanoid Robotics Institute, Waseda University
#59-308, 3-4-1 Ookubo, Shinjuku-ku, Tokyo, Japan

Abstract

The authors have been developing human-like head robots in order to develop new head mechanisms and functions for a humanoid robot that can communicate naturally with a human by expressing human-like emotion. When humans communicate through auditory or visual sensations, they estimate the other's mental state using not only the information from each sensation but also their mental model. The robot's mental model consists of a 3D mental space with pleasantness, activation and certainty axes. When the robot senses a stimulus, a mental vector moves in this mental space according to the Equations of Emotion. We also propose a Robot Personality, which consists of a Sensing Personality and an Expression Personality, and we assigned and loaded various Sensing Personalities.

1. Introduction

At present, most practical robots are industrial robots used in the manufacturing industry. When humans use industrial robots, they must define the robot's behavior through a complex process and method, and whenever the robot needs to perform new actions or tasks, the operator must reprogram it with new commands. The operator must therefore know the robot programming language. However, people will not adopt such robots for wide use as personal robots in the future, because the complexity of robot programming languages is beyond the ordinary person. A personal robot must be able to communicate naturally with humans, reprogramming itself based on its communication with them.

In communication among humans, personality and affinity matching are extremely important in attaining smooth and effective communication (Fig. 1.1). Each personal robot of the future will need a personality and an affinity so that it can communicate with humans as naturally as humans communicate with each other.

(Fig. 1.1 Personality Matching between a human and a robot)

The authors have been developing human-like head robots in order to develop new head mechanisms and functions for a humanoid robot able to express emotions like a human, working toward personality matching between a human and a robot. Human personality has been researched for almost as long as humans have been able to talk. It consists of genetic factors and environmental factors; the former are inborn, while the latter are acquired through experience and learning throughout one's life [1][2][3]. In psychology, personality consists of many components and cannot be cleanly separated. In this study, however, the authors adopt a narrow definition of personality and consider the Robot Personality to consist of the Sensing Personality and the Expression Personality.

The head robot is an area of active research in the field of robotics. Brooks developed a head robot that expresses facial expressions using eyes, eyelids, eyebrows and a mouth; it can communicate with humans using visual information from CCD cameras [4]. Hara developed a head robot that uses fourteen of Ekman's Action Units [5][6]; it can recognize a human's facial expression and express the same facial expression [7][8].
The authors were able to produce coordinated head-eye motion with the V.O.R. (Vestibulo-Ocular Reflex) [9][10], depth perception using the angle of convergence between the two eyes [11], adjustment to the brightness of an object with the eyelids [12], and a human-like expression of the eyebrows, lips and jaw [13].

Moreover, we realized auditory sensation and cutaneous sensation as human sensations [14]. We also introduced the Equations of Emotion and a mental model with three independent parameters for the robot [13][14]. In this paper, we describe a new robot personality that expresses more human-like emotional behavior than previous robots.

2. Robot Personality

(Fig. 2.1 Basic Information Processing Structure: environment, stimulus recognition, Sensing Personality, dynamic mental model, Expression Personality, facial/neck motion)
(Fig. 2.2 The 3D Mental Space: pleasantness (pleasant/unpleasant), activation (arousal/sleep) and certainty (certain/uncertain) axes)
(Fig. 2.3 Emotional Space and Region Mapping of Emotions: Happiness, Anger, Disgust, Fear, Sadness, Surprise, Neutral)

In communication among humans, personality and affinity matching is extremely important for producing smooth and effective communication. The personal robots of the future will have personalities and affinities in order to communicate naturally with humans. When different people sense the same stimuli, the changes in their mental states differ; and even when their mental states are the same, their expressions differ. Therefore, we introduced a Robot Personality, consisting of a Sensing Personality and an Expression Personality located before and after the dynamic mental model. When the robot senses stimuli, it expresses a facial expression and a neck motion in accordance with the flow chart in Fig. 2.1. We installed the dynamic mental model and the Robot Personality in a robot controlled by this flow. We defined the mental space shown in Fig. 2.2 as the mental model of the robot. We consider that the structure of the mental space is not changed by the robot's mechanism or personality; it has an invariable structure.

2.1 Stimulus Preprocessing

WE-3RIV senses stimuli from the environment with the sensors shown in Table 3.2. The robot converts the information of the external stimuli into concrete sensation information that affects its mental state.

2.2 Dynamic Mental Model

A 3D mental space, consisting of a pleasantness axis, an activation axis and a certainty axis, is defined in WE-3RIV. The mental space of WE-3RIII was a two-plane space whose certainty axis took only one of the two values 1 or -1 [14]; in WE-3RIV, by contrast, certainty is a continuous value ranging from -1 to 1. WE-3RIV's 3D mental space is illustrated in Fig. 2.2. The vector M, named the Mental Vector, expresses the mental state of WE-3RIV and evolves according to the Equations of Emotion (2.1) [14]:

    M_{t+Δt} = M_t + ΔM
    M_t = (a_t, p_t, c_t)
    ΔM = (Δa, Δp, Δc)                                    (2.1)

We defined the emotions of WE-3RIV as in Equation (2.2), mapping seven emotions onto regions of the 3D mental space as in Fig. 2.3. We consider that Surprise occurs when the activation level rises too far, so we defined Surprise as the large region at high activation. WE-3RIV determines its emotion from the region that the Mental Vector M_t is passing through.

    E = {Happiness, Anger, Disgust, Fear, Sadness, Surprise, Neutral}
    emotion ∈ E                                          (2.2)
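As a concrete illustration of Equations (2.1) and (2.2), the following Python sketch updates a mental vector and looks up its emotion region. This is our reading of the model, not the authors' code: the axis bounds and region thresholds are invented for illustration, since the paper defines the regions only graphically in Fig. 2.3.

from dataclasses import dataclass

def clamp(x, lo=-1.0, hi=1.0):
    """Keep a mental-space coordinate inside its assumed [-1, 1] range."""
    return max(lo, min(hi, x))

@dataclass
class MentalVector:
    a: float = 0.0  # activation
    p: float = 0.0  # pleasantness
    c: float = 0.0  # certainty

def update_mental_vector(m, da, dp, dc):
    """Equation (2.1): M_{t+dt} = M_t + dM, applied component-wise."""
    return MentalVector(clamp(m.a + da), clamp(m.p + dp), clamp(m.c + dc))

def decide_emotion(m):
    """Region lookup in the mental space (Fig. 2.3); thresholds are invented."""
    if m.a > 0.8:
        return "Surprise"            # the large region at very high activation
    if abs(m.p) < 0.2 and abs(m.a) < 0.2:
        return "Neutral"             # a small region around the origin
    if m.p > 0.2:
        return "Happiness"           # pleasant half-space
    if m.a > 0.2:                    # unpleasant and aroused
        return "Anger" if m.c > 0.0 else "Fear"
    return "Disgust" if m.c > 0.0 else "Sadness"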
2.3 Sensing Personality

The Sensing Personality determines how a stimulus acts on the components of ΔM in the Equations of Emotion. The small differences Δa, Δp and Δc are described as follows:

    Δa = f_a(S_t, a_t, p_t, c_t)
    Δp = f_p(S_t, a_t, p_t, c_t)
    Δc = f_c(S_t, a_t, p_t, c_t)
    S_t = (S_Vt, S_At, S_Tt, S_Ht, S_Ot)                 (2.3)

where S_Vt is the visual sensation, S_At the auditory sensation, S_Tt the tactile sensation, S_Ht the temperature sensation, and S_Ot the olfactory sensation. Equations (2.3) are the mapping functions of the Sensing Personality. We assigned several specific Sensing Personalities to WE-3RIV, and WE-3RIV was able to show the expressions of these various robot personalities. Table 2.1 shows the basic Sensing Personality we assigned, indicating whether each stimulus acts as a negative or positive influence on the components of ΔM. By changing the mapping functions between the stimuli and ΔM, it is possible to obtain a wide variety of Sensing Personalities. The authors consider the Sensing Personality very important to the relationship between the stimuli and the emotion, because the mental vector determines the robot's emotion.

Table 2.1 Stimuli and the Sensing Personality mapping functions for WE-3RIV (columns: Stimulus, Sensation, Δa, Δp, Δc; each entry is positive or negative, and "0" means the component converges to 0):
  Visual: Lose Sight of the Target / Discover the Target / Dazzling Light / Target is Near / Peripheral Vision
  Tactile: Pushed / Pushed Strongly / Hit
  Auditory: Loud Sound
  Temperature: Heat
  Olfactory: Ammonia / Smoke
  No Sense: No Stimulus
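The mapping functions (2.3) can be pictured as a table like Table 2.1 that assigns each recognized stimulus a signed contribution to (Δa, Δp, Δc). The sketch below is a minimal, hypothetical Sensing Personality: the stimulus names loosely follow Table 2.1 and the Section 4.1 experiment, the numeric values are invented, and the real functions f_a, f_p, f_c also depend on the current state (a_t, p_t, c_t), which is omitted here. Swapping in a different dictionary swaps the personality.

# Hypothetical Sensing Personality: signed contributions of each stimulus
# to (da, dp, dc).  Names are illustrative; values are invented.
SENSING_PERSONALITY = {
    "discover_target": (+0.2, +0.1, +0.3),
    "lose_target":     (+0.1, -0.1, -0.3),
    "dazzling_light":  (+0.3, -0.2,  0.0),
    "pushed":          (+0.1, -0.1,  0.0),
    "hit":             (+0.4, -0.4,  0.0),
    "loud_sound":      (+0.3, -0.1,  0.0),
    "heat":            (+0.2, -0.2,  0.0),
    "smell_alcohol":   (+0.1, +0.2,  0.0),
    "smell_ammonia":   (+0.2, -0.3,  0.0),
    "no_stimulus":     (-0.05, 0.0,  0.0),  # "0": let the component decay
}

def sensing_deltas(stimuli, personality):
    """A reduced form of f_a, f_p, f_c in Equation (2.3): sum the table
    entries of the currently sensed stimuli (state dependence omitted)."""
    da = dp = dc = 0.0
    for s in stimuli:
        ea, ep, ec = personality.get(s, (0.0, 0.0, 0.0))
        da, dp, dc = da + ea, dp + ep, dc + ec
    return da, dp, dc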

2.4 Expression Personality

The Expression Personality is paired with the Sensing Personality within the Robot Personality. It affects the facial expression and the neck motion, and its basic structure is similar to that of the Sensing Personality. Even when human beings share the same mental state, their expressions differ: when humans become angry, some remain calm while others clearly express their anger. Likewise, we can obtain a wide variety of Expression Personalities by changing the mapping functions of the Expression Personality, even though the mental state of the robot is the same.

2.5 Facial/Neck Motion Control for Expressing Emotion

The robot outputs facial expressions and neck motions based on its emotion and the Expression Personality. We used Ekman's Six Basic Facial Expressions [5][6] in the robot's facial control, defining seven facial patterns: Happiness, Anger, Disgust, Fear, Sadness, Surprise and Neutral. The strength of each facial expression is varied by a fifty-grade proportional interpolation of the differences in position from the Neutral facial expression. WE-3RIV's facial patterns are shown in Fig. 3.3. Further, WE-3RIV can use neck motions to express certain emotions; although a neck motion is generated by the stimulus the robot senses, it is modified by the emotion [14].
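The fifty-grade interpolation of Section 2.5 can be sketched as a linear blend between the Neutral actuator pattern and a target emotional pattern. A minimal sketch follows, assuming made-up axis names and position values; the real robot drives the eyebrow, lip, eyelid and jaw mechanisms listed in Table 3.1.

# Illustrative fifty-grade proportional interpolation (Section 2.5).
# Axis names and positions are invented; grade 0 is Neutral, 50 is full.
NEUTRAL   = {"brow": 0.0, "lip": 0.0, "eyelid": 0.5, "jaw": 0.0}
HAPPINESS = {"brow": 0.3, "lip": 0.8, "eyelid": 0.6, "jaw": 0.2}

def facial_pattern(target, grade, neutral=NEUTRAL, grades=50):
    """Interpolate each actuator between Neutral and the target pattern."""
    k = max(0, min(grades, grade)) / grades
    return {axis: neutral[axis] + k * (target[axis] - neutral[axis])
            for axis in neutral}

# Example: facial_pattern(HAPPINESS, 25) yields a half-strength smile.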

3. Hardware Configuration

Fig. 3.1 (whole view) and Fig. 3.2 (head part) present the hardware overview of WE-3RIV. WE-3RIV has 26 DOF (Degrees of Freedom), as shown in Table 3.1, and the sensors shown in Table 3.2, which serve as sensory organs for external stimuli. The following describes each part.

3.1 The Neck, Eyeballs and Eyelids

The maximum angular velocity of each axis is similar to that of a human: 160 [deg/s] for the neck, 600 [deg/s] for the eyeballs and 900 [deg/s] for the eyelids. Furthermore, this robot can blink within 0.3 [s], which is as fast as a human blinks [14].

Table 3.1 DOF Configuration
  Neck: 4, Eyes: 4, Eyelids: 4, Eyebrows: 8, Lips: 4, Jaw: 1, Lung: 1; Total: 26

3.2 Facial Expression Mechanism with Skin Color Expression

We used the same mechanism as WE-3RIII for the lips and the eyebrows [14]. We recently added a facial color expression function to the skin, using a red EL (Electro Luminescent) sheet, a thin and light device that does not interfere with the other devices on the skin, such as the FSRs (Force Sensitive Resistors) used to detect external forces such as push, hit and stroke [14] for tactile sensation. Fig. 3.3 shows the six basic facial expressions and the Neutral facial expression performed by WE-3RIV: (a) Happiness, (b) Anger, (c) Surprise, (d) Sadness, (e) Fear, (f) Disgust, (g) Neutral. Moreover, we have added the drunken and shame facial expressions to WE-3RIV, shown in Fig. 3.4 ((a) Drunken, (b) Shame), in addition to the six basic facial expressions.

3.3 Sensors

We also added olfactory sensation to WE-3RIV. The olfactory sensation consists of four semiconductor gas sensors, with which WE-3RIV can recognize the smells of alcohol, ammonia and cigarette smoke. Moreover, WE-3RIV recognizes the target position in 3D space, pursues the target, and adapts to brightness as its visual sensation, and it localizes sound in 3D space from the loudness and phase differences as its auditory sensation. As the human cutaneous senses, WE-3RIV has tactile sensation and temperature sensation, and it recognizes not only the magnitude of a force but also the difference in touching behavior, such as push, hit and stroke.

Table 3.2 Sensors
  Vision: CCD Camera x 2
  Auditory: Microphone x 2
  Cutaneous (tactile): FSR x 28
  Cutaneous (temperature): Temperature Sensor IC x 4
  Olfactory: Semiconductor Gas Sensor x 4

3.4 Total System

Fig. 3.5 shows the total system configuration of WE-3RIV. We used four computers (PC/AT compatible); an Ethernet network connects PC1 and the other three computers. Table 3.3 shows the functions of each PC. PC1 obtains the outputs of the semiconductor gas sensors through a 12 [bit] A/D board to recognize smells, and determines WE-3RIV's mental state according to the visual, auditory, cutaneous and olfactory information. PC2 controls the DC motors of the eyeballs, neck, eyelids, jaw and lung according to the visual and mental information sent from PC1; in addition, PC2 obtains information from the temperature sensors through a 12 [bit] A/D board and transmits it to PC1. PC3 controls the eyebrows, lips and facial color; to control the facial color, PC3 sends the reference input to the EL sheets through a 12 [bit] D/A board and an inverter circuit. PC4 calculates the direction of a sound from the loudness and the sound pressure difference between the right and left microphones, and transmits this information to PC1.

Table 3.3 Functions of each PC
  PC1 (Windows 95): visual, tactile and olfactory sensation; mental state
  PC2 (MS-DOS): neck, eyeball, eyelid, jaw and lung motion; temperature sensation
  PC3 (MS-DOS): eyebrow and lip motion; facial expression; face color
  PC4 (Windows 98): auditory sensation
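PC4's sound localization from the left/right loudness difference might look like the sketch below. The paper does not give the algorithm; this only illustrates the idea that an interaural level difference maps to a bearing, with full_scale_db as an assumed calibration constant.

import math

def sound_direction(rms_left, rms_right, full_scale_db=6.0):
    """Map the L/R loudness difference (dB) to a bearing in [-90, 90] deg;
    positive angles mean the source is to the robot's right."""
    if rms_left <= 0.0 or rms_right <= 0.0:
        return None  # no usable signal on one channel
    diff_db = 20.0 * math.log10(rms_right / rms_left)
    diff_db = max(-full_scale_db, min(full_scale_db, diff_db))
    return 90.0 * diff_db / full_scale_db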

4. Experimental Evaluation

4.1 Sensing Personalities

The authors evaluated the effect of the Sensing Personality. We prepared four Sensing Personalities and switched between them, assigning them by changing the tactile and olfactory parts of the Sensing Personality shown in Table 2.1. Table 4.1 shows the Sensing Personality patterns used in the experiments (columns: Stimulus, Sensation, Δa, Δp, Δc). We obtained the robot's mental states through the following experimental procedure:

(1) WE-3RIV starts.
(2) Pursuing the target.
(3) Smelling the alcohol.
(4) Hitting the head.
(5) Stroking the head.
(6) Turning off the light.
(7) WE-3RIV goes to sleep.

Fig. 4.1 shows the resulting trajectories of the mental vector obtained through the experiments, for (a) Pattern No. 1 through (d) Pattern No. 4. Because the robot's reaction to the stimuli differed across patterns even though it was affected by the same stimuli, we took this to mean that the robot has various personalities. Therefore, we confirmed that it is possible to obtain a wide variety of Robot Personalities using various Sensing Personalities.

4.2 Expressing a Mental State

We ran an experiment in which WE-3RIV was affected by various visual, auditory, cutaneous and olfactory stimuli, again using the Sensing Personality shown in Table 2.1. Fig. 4.2 shows the results: (a) Normal, (b) Surprise, (c) Anger, (d) Happiness, (e) Sadness, (f) Fear, (g) Smelling, (h) Reaction to Alcohol, (i) Reaction to Ammonia, (j) Reaction to Cigarette. We confirmed that WE-3RIV can dynamically express its mental states, which are changed by the visual, auditory, cutaneous and olfactory stimuli, with facial expressions, facial color and neck motions. Fig. 4.3 shows an example of the resulting trajectory of the mental vector obtained through the experiment. Because of the added olfactory sensation, WE-3RIV can respond to more stimuli than our previous robot WE-3RIII; and by adding the facial color expression function as part of the emotional expression, the expressive ability of WE-3RIV has increased. Therefore, we confirmed that WE-3RIV became more human-friendly than our previous robot WE-3RIII.
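Tying the sketches together, one can replay a stimulus script like that of Section 4.1 through the model and trace a mental-vector trajectory analogous to Fig. 4.1. The sketch below assumes the MentalVector, update_mental_vector, decide_emotion and sensing_deltas definitions given earlier, and abbreviates the seven experimental steps to the stimulus names of the hypothetical table.

def run_experiment(personality):
    """Replay an abbreviated Section 4.1 stimulus script and record the
    mental-vector trajectory (cf. Fig. 4.1), using the earlier sketches."""
    m = MentalVector()
    trajectory = [m]
    script = ["discover_target", "smell_alcohol", "hit",
              "pushed", "lose_target"]  # abbreviated steps (2)-(6)
    for stimulus in script:
        da, dp, dc = sensing_deltas([stimulus], personality)
        m = update_mental_vector(m, da, dp, dc)
        trajectory.append(m)
        print(f"{stimulus:16s} -> {decide_emotion(m)}")
    return trajectory

# Running run_experiment() with two different personality tables yields two
# different trajectories for the same stimulus script, which is the point
# of the Sensing Personality experiment.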

5. Conclusions and Future Work

(1) We introduced the Robot Personality, consisting of the Sensing Personality and the Expression Personality, for expressing robot emotions.
(2) The human-like head robot WE-3RIV realized the expression of various robot personalities using the various Sensing Personalities which we assigned and loaded.
(3) WE-3RIV expressed its mental state, changed by visual, auditory, cutaneous and olfactory stimuli, through facial expressions, facial color and neck motions.

The Expression Personality is of a single type at present. In the future, we will experiment with various Expression Personalities.

Acknowledgment

A part of this research was done at the Humanoid Robotics Institute (HRI), Waseda University. The authors would like to thank ATR (Advanced Telecommunications Research Institute International), HITACHI, Ltd., MINOLTA Co., Ltd., OKINO Industries, Ltd., SANYO ELECTRIC Co., Ltd., SHARP Corp., SMC Corp., and SONY Corp. for their financial support of HRI.

References

[1] G. W. Allport: Pattern and Growth in Personality, New York: Holt, Rinehart and Winston, 1961
[2] Richard S. Lazarus, Alan Monat: Personality, 3rd edition, Prentice-Hall, Inc., 1979
[3] Saichi Kurachi: Jinkaku Keisei no Shinrigaku (The Psychology of Personality Formation; in Japanese), Kitaohji Shobo, 1986
[4] Cynthia Breazeal, Brian Scassellati: How to build robots that make friends and influence people, IROS'99, 1999
[5] Paul Ekman, Wallace V. Friesen: Facial Action Coding System, Consulting Psychologists Press Inc., 1978
[6] Tsutomu Kudo, P. Ekman, W. V. Friesen: Hyojo Bunseki Nyumon - Hyojo ni Kakusareta Imi wo Saguru (Introduction to Facial Expression Analysis: Exploring the Meaning Hidden in Facial Expressions; in Japanese), Seishin Shobo, 1987
[7] Hiroshi Kobayashi, Fumio Hara, et al.: Study on Face Robot for Active Human Interface - Mechanisms of Face Robot and Facial Expressions of 6 Basic Emotions, Journal of the Robotics Society of Japan, Vol. 12, No. 1, 1994
[8] Hiroshi Kobayashi, Fumio Hara: Real Time Dynamic Control of 6 Basic Facial Expressions on Face Robot, Journal of the Robotics Society of Japan, Vol. 14, No. 5, 1996
[9] Kazutaka Mitobe, et al.: Consideration of Associated Movements of Head and Eyes to Optic and Acoustic Stimulation, The Institute of Electronics, Information and Communication Engineers, Vol. 91, pp. 81-87, 1992
[10] Laurutis, V. P. and Robinson, D. A.: The vestibulo-ocular reflex during human saccadic eye movements, J. Physiol., 373, pp. 209-233, 1986
[11] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye Robot with Two Eyes, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 799-804, 1997
[12] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye System for a Humanoid Robot - Realization of Human-like Head-Eye Motion Using Eyelids Adjusting to Brightness, Proceedings of the IEEE International Conference on Robotics and Automation, 1998
[13] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye Robot WE-3RII with Autonomous Facial Expression Mechanism, Proceedings of the IEEE International Conference on Robotics and Automation, 1999
[14] Atsuo Takanishi, et al.: An Anthropomorphic Head-Eye Robot Expressing Emotions based on Equations of Emotion, Proceedings of the IEEE International Conference on Robotics and Automation, 2000
