Imitation based Human-Robot Interaction -Roles of Joint Attention and Motion Prediction-


Proceedings of the 2004 IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Okayama, Japan, September 20-22, 2004

Yusuke AKIWA,* Yuki SUGA,* Tetsuya OGATA,** and Shigeki SUGANO*
* Humanoid Robotics Institute (HRI), Waseda University, Ookubo, Shinjuku-ku, Tokyo, Japan
** Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, Japan
y.akiwa.623@ruri.waseda.jp, ysuga@suou.waseda.jp, ogata@i.kyoto-u.ac.jp, sugano@paradise.mech.waseda.ac.jp

Abstract - Behavior imitation is crucial for the acquisition of intelligence as well as for communication. This paper describes two kinds of human-robot communication experiments based on behavior imitation. One compared the results obtained when the robot did and did not predict the experimental subject's behaviors using past datasets; the other compared the results obtained with and without target objects in the simulator environment. The former experiment showed that predicting the subject's behaviors increased the subject's interest. The latter confirmed that the presence of objects facilitates joint attention and makes human-robot communication possible even when the robot uses a simple imitation mechanism. This result shows that in human-robot communication, the human not only recognizes the robot's behaviors passively but also adapts to the situation actively. In conclusion, we confirmed that motion prediction and the presence of objects for joint attention are important for human-robot communication.

1 Introduction

The serious problem posed by Japan's decreasing birthrate and aging population is expected to be addressed in part by using robot technology in the care of the elderly.
Since this will require robots to interact with human beings, they should be able to recognize expressions of emotion and to communicate nonverbally as well as verbally. Many researchers have been investigating these kinds of abilities (e.g., multimodal conversation [1][2][3], nodding robots [4][5][6][7], and physical interaction with human beings [8][9]). We focus on face-to-face communication between a human being and a robot. We pay particular attention to imitation, which is thought to be one of the most crucial functions underlying the capability for communication. For example, it is well known that young children imitate their mother's behaviors, and that this helps them learn to control their motions. M. Kawato has worked on easier ways to program the behavior of humanoid robots, one of which is having robots learn human behaviors by watching them [12]. Y. Kuniyoshi has tried to reveal the structure of imitation and the mechanism of imitation development [13][14]. H. Kozima has investigated the mechanism and development of communication ability; his group developed a robot, Infanoid, that has pre-verbal communication devices, and has tried to give it attention-sharing ability, in the sense of monitoring a human being's attention [15][16][17]. These researchers have explored the nature and development of imitation and shared attention. They have built robots that can imitate and can discern where a person's attention is directed. However, they have not directly addressed the problem of communication. We therefore investigated the communicative aspect of imitation and tried to develop a human-robot communication system. We applied the concepts of shared attention and motion prediction in our system. To consider the function of imitation in communication, these are thought to be key concepts, as discussed in the next chapter.

This paper is organized as follows. Section 2 describes the key concepts of our approach to imitation-based communication, Section 3 describes our experimental system and methods, Section 4 describes and discusses our experimental results, and Section 5 concludes the paper with a brief summary and our plans for future work.

2 Our Approach

2.1 Motion prediction

It is easy to guess that a robot which merely imitates human motion will bore the human, because it cannot prevent the subject from predicting the robot's behavior. In human-human communication, on the other hand, people infer each other's internal states, and this difficulty is what makes communication interesting. This inferential aspect should also be applied to human-robot communication. We implemented a simple prediction function in the imitation system based on difference approximation. We accumulated the positions of the subject's two hands, and the imitation system calculates the next coordinates of the experimental subject's hands by using the differences between their previous positions. This function not only makes the robot's behavior smooth and natural, but also surprises observers, because a wrong prediction makes the robot's motion very different from human motion. This makes it hard for the person to predict the robot's behavior, and the communication becomes interesting.

2.2 Joint Attention

Joint attention is an early-developing social-communicative skill in which two people (usually a young child and an adult) use gestures and gaze to share attention with respect to interesting objects or events. The infant turns his/her eyes to the object that the parents turn their eyes to and pays attention to that object. After a while, the infant starts to pay attention to things actively.
The infant thereby learns how to communicate with human beings through the recognition shared with his/her mother. Preverbal communication, that is, an infant's communication using nonverbal channels (e.g., joint attention and facial expressions), thus plays an indispensable role in the development of human communication. We therefore implemented the concept of joint attention in our imitation system so that we could explore its role in communication. In this system, when the experimental subject interacts with the virtual robot on a simulator, we place target objects in both the real world and the virtual space. These target objects make it easy for the experimental subject to understand the robot's behavior. In the theory of mind, inferring a partner's internal state is said to be crucial for communication.

3 Experimental System and Method

3.1 Platform System

Our platform system uses visual and audio information. An experimental subject is equipped with three markers, on his/her head and hands. Using two images captured by two cameras, the system calculates the three-dimensional coordinates of the markers with an image processor based on trigonometric (triangulation) algorithms. The audio information used in the system includes pace, volume, and speed data, and the system also recognizes words spoken by the experimental subject. The system output is the robot's voice and behavior in a virtual space. The virtual robot has two arms (three degrees of freedom (DOF) in each shoulder and one DOF in each elbow) and a head (two DOF), giving it enough joints to make simple gestures. The experimental subjects sat in front of a display and interacted with the virtual robot shown on it. The experimental environment is shown in Figure 1.

Figure 1: Experimental environment. Part (a) shows the distance between the subject and the display, part (b) shows a camera view, and part (c) shows the displayed robot.

3.2 Motion prediction

We evaluated the role of motion prediction in communication by comparing the results of experiments in which the experimental subjects (fifteen students) tried, using only gestures, to teach a behavior to a robot with and without the motion-prediction function. The motion-prediction function diagram is shown in Figure 2. In this experiment, we did not use voice information. In every processing step, with the image processor (IP7000, HICOS Inc.), the system calculated the 3-D positions based on the trigonometric algorithms, and also calculated the attention direction of the subject's head based on the difference between the sizes of the two markers on the subject's head. The system then ordered the virtual robot in the simulator to reach its arm to the calculated positions and to move its head. In each frame interval (50 ms), the prediction function calculates the next position of the subject's hands based on the difference approximation.

Figure 3: Questionnaire items (e.g., Was the robot friendly? Did you feel a sense of unity? Did you want to interact with the robot actively? Were your behavior and the robot's behavior similar? Are you interested in interaction with the robot? Could you interact with the robot? Was the robot active? Did the robot behave in various ways? Did the robot respond rapidly to your behavior? Was the robot's behavior surprising? Did you feel that the robot interacted with you? Was the robot's behavior speedy? Did you feel that the robot was a living being or machinery? Did you feel bored? Was the experiment simple or complex?)

The delimited behaviors are called behavior units. Each behavior unit includes movement data, voice data, and the three-dimensional positions of the subject's hands. From the accumulated behavior units, the system adopts the one whose position data is closest to the current positions of the subject's hands, and outputs it.
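The behavior-unit selection described above amounts to a nearest-neighbor lookup over the accumulated units, keyed on hand position. A minimal sketch of this idea follows; the `BehaviorUnit` field layout and the Euclidean distance metric are our assumptions, since the paper lists only the kinds of data a unit contains and does not specify the metric:

```python
import math
from dataclasses import dataclass

@dataclass
class BehaviorUnit:
    """One delimited behavior: movement, voice, and 3-D hand positions.
    Field layout is an assumption; the paper names only the data kinds."""
    movement: list         # joint-angle trajectory (assumed representation)
    voice: str             # recognized word associated with this unit
    hand_positions: tuple  # (x, y, z) of the hands when the unit was recorded

def euclidean(a, b):
    """Straight-line distance between two 3-D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def select_unit(units, current_hands):
    """Adopt the accumulated unit whose stored hand position is
    closest to the subject's current hand position."""
    return min(units, key=lambda u: euclidean(u.hand_positions, current_hands))

units = [
    BehaviorUnit([], "hello", (100.0, 200.0, 50.0)),
    BehaviorUnit([], "ball",  (400.0, 150.0, 80.0)),
]
print(select_unit(units, (390.0, 160.0, 75.0)).voice)  # nearest unit: "ball"
```

With more accumulated units this linear scan could be replaced by a spatial index, but for the small datasets of a single session the brute-force minimum suffices.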
Figure 2: Motion prediction function diagram.

After the experiment, we used a questionnaire survey to evaluate the impressions of the experimental subjects. The questionnaire asked twenty questions such as those shown in Figure 3.

3.3 Joint attention

The system algorithm is shown in Figure 4. The system recognizes the spoken word and calculates the joint angles of the subject's arms and neck, using the algorithm described above. Next, the system integrates the voice information (the recognized word, volume, and pace) and the visual information (the joint angles). In the integration section, the system delimits the subject's behavior based on the magnitude of its change.

We evaluated the role of joint attention in communication by having the experimental subjects (fifteen students) try, using both voice and gestures, to teach a robot with a joint-attention system, and comparing the results obtained with and without target objects

(a soccer ball and a die (see Figure 5)) in the experimental space. We simplified the problem by not taking into account the effects of the robot's eyes in the joint-attention system.

Figure 5: Target objects.

After the experiment, a questionnaire survey was taken to evaluate the experimental subjects' impressions of the robot's behaviors. The questionnaire asked ten questions such as those shown in Figure 3.

4 Experimental Results and Discussion

4.1 Experiment 1 (Prediction Function)

Graphs comparing the predicted coordinates of the hands with the actual coordinates of the experimental subject's hands (acquired from the cameras) are shown in Figure 6. We defined the prediction rate P as P = S/N x 100 [%], where N is the total number of predictions and S is the number of correct predictions. When we defined the threshold as 50 mm (half the width of a human palm), because such differences are small enough not to influence people's impressions, the prediction rates of the X, Y, and Z coordinates were 60.4%, 75.6%, and 39.6%, respectively.

In the principal component analysis of the questionnaire, "interest in the robot" was obtained as the first principal component and "mind-like intention" as the second. For each principal component, we examined whether motion prediction was used and detected a significant difference between the cases in which it was used and the cases in which it was not (Figure 7).

Figure 6: Predicted and actual coordinates over time.

Figure 7: Prediction function result (interest in the robot, p = 0.015; simple imitation vs. imitation with motion prediction, p < 0.05).

When we did not use the motion-prediction function, the robot on the simulator simply imitated the experimental subject's behavior. That is, it imitated passively.
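The difference-approximation predictor and the prediction rate P = S/N can be sketched as follows. This is a hedged reconstruction: the paper gives only the 50 ms frame interval, the 50 mm threshold, and the per-axis rates, so the function names and the per-axis scoring below are our assumptions:

```python
def predict_next(prev, curr):
    """First-order difference approximation: extrapolate the next hand
    position one frame (50 ms) ahead from the last two observed positions."""
    return tuple(c + (c - p) for p, c in zip(prev, curr))

def prediction_rate(predicted, actual, threshold=50.0):
    """Per-axis P = S/N x 100: a prediction counts as correct (S) when it
    lies within the threshold (50 mm, half a palm width) of the actual value."""
    n = len(predicted)
    rates = []
    for axis in range(3):
        s = sum(1 for pred, act in zip(predicted, actual)
                if abs(pred[axis] - act[axis]) <= threshold)
        rates.append(100.0 * s / n)
    return rates  # [P_x, P_y, P_z] in percent

# A hand moving steadily +30 mm per frame in x: the extrapolation is exact.
print(predict_next((0.0, 0.0, 0.0), (30.0, 0.0, 0.0)))  # (60.0, 0.0, 0.0)
```

A first-order extrapolation of this kind is exact for constant-velocity motion and overshoots whenever the hand accelerates or reverses, which is consistent with the paper's observation that wrong predictions produce conspicuously non-human motion.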
As time went by, the experimental subject became bored and eventually lost interest in the robot. When we used the motion-prediction function and the function predicted the motion correctly, the robot anticipated the experimental subject's behavior, and the robot's behavior preceded the subject's. The experimental subjects then had the impression that the robot was imitating them just as another person would. The motion prediction was sometimes wrong; such failures made the robot's behavior differ from the subject's behavior and sometimes surprised the subject. This made the robot's behavior interesting to the experimental subjects. Consequently, with the

prediction function, we could improve our imitation system. In addition, our experimental results suggested that motion prediction plays a large part in communication.

4.2 Experiment 2 (Joint Attention)

In the principal component analysis of the questionnaire, communication ability was obtained as the first principal component and the sense of unity with the robot as the second. For each principal component, we detected a significant difference depending on the presence or absence of the target objects (Figure 8).

Figure 8: Joint attention result (sense of unity with the robot, p = 0.021; with vs. without target objects, p < 0.05).

If the robot points at a target object, the experimental subject can guess that the robot has recognized it. Since the experimental subject can regard the robot's behavior as a response to his/her teaching, the existence of a target object makes the behavior easy to make sense of, so the experimental subject can anticipate the robot's behavior. When there are no target objects, the robot's behavior is the subject's only information for guessing what the robot thought, and the experimental subjects grew bored with it. Consequently, the existence of the target objects supports joint attention and helps the experimental subject make better guesses about the robot's behavior. It also helps give a sense of unity and a feeling of empathy, and the sense of unity can be considered to enhance the communication ability of the robot. The existence of target objects makes the experimental subject feel as if the robot has a mind like a person's. We confirmed that joint attention is necessary for guessing a partner's internal state from his/her behavior. This can also be explained from the perspective of theory of mind.
Joint attention can play a role in thinking about a partner's position and in thinking from the partner's point of view.

5 Conclusion and Future Work

In this work we paid particular attention to imitation because it is important for the acquisition of intelligence and for communication. We presented the results of two experiments using an imitation-based communication system. In one experiment, the robot predicted the experimental subject's behaviors by using past datasets. In the other, focusing on joint attention as an aspect of communication, we compared what happened when there were target objects in the simulator environment with what happened when there were not. The results of the former experiment showed that predicting a person's behaviors evoked that person's interest. This shows that in human-robot communication, the human being not only recognizes the behaviors of robots passively but also adapts to the situation actively. It also implies that introducing the prediction system makes human-robot communication more similar to human-human communication. The latter experiment confirmed that the presence of objects facilitates joint attention and can make human-robot communication possible even when the robot uses a simple imitation mechanism. In conclusion, we confirmed that motion prediction and the presence of objects for joint attention are important for human-robot communication.

In the future, we will experiment with a system that combines motion prediction and the joint-attention system. We will also investigate the relation between the prediction rate and the impression the robot makes, that is, how strongly the prediction rate affects the experimental subject's impression. In addition, we will investigate various methods of motion prediction, the relation between the prediction method and the prediction rate, and the influence of the experimental subject's gender and age.
Acknowledgment

This research was supported in part by a Grant-in-Aid for the WABOT-HOUSE Project by Gifu Prefecture.

References

[1] Y. Matsusaka, S. Fujie, T. Kobayashi, "Modeling of conversational strategy for the robot participating in the group conversation," Proc. Interspeech 2001 (ISCA), Sept. 2001.
[2] S. Hashimoto et al., "Humanoid Robot: Development of Information Assistant Robot Hadaly," 6th IEEE International Workshop on Robot and Human Communication.
[3] S. Hashimoto et al., "Humanoid Robots in Waseda University: Hadaly-2 and WABIAN," Autonomous Robots, Vol. 12, No. 1, Jan.
[4] S. Kawato and J. Ohya, "Real-time detection of nodding and head-shaking by directly detecting and tracking the between-eyes," Proc. Fourth IEEE International Conference on Automatic Face and Gesture Recognition.
[5] T. Watanabe, M. Okubo, M. Inadome, "Respiratory Entrainment in Face-to-Face Communication," 13th Symposium on Human Interface, Oct. 21-23, 1997, Osaka.
[6] T. Watanabe, "E-COSMIC: Embodied Communication System for Mind Connection," IEICE Technical Report, HCS.
[7] Y. Ishii, T. Watanabe, "Evaluation of the Arrangement of Virtual Actors in Remote Communication by Using the Embodied Virtual Communication System," Human Interface Society, Vol. 4, No. 2.
[8] T. Tojo, Y. Matsusaka, T. Ishii, T. Kobayashi, "A conversational robot utilizing facial and body expressions," Proc. 2000 IEEE SMC, Vol. 2.
[9] T. Kanda, H. Ishiguro, T. Ono, M. Imai, R. Nakatsu, "An evaluation of interaction between humans and an autonomous robot Robovie," The Robotics Society of Japan, Vol. 20, No. 3.
[10] H. Kobayashi, F. Hara, "Facial interaction between animated 3D face robot and human beings," Proc. 1997 IEEE SMC, Vol. 4.
[11] H. Miwa, T. Okuchi, H. Takanobu, A. Takanishi, "Development of a New Human-like Head Robot WE-4," IROS 2002.
[12] S. Schaal, "Is imitation learning the route to humanoid robots?," Trends in Cognitive Sciences, 3(6).
[13] S. Kagami, J. J. Kuffner, K. Nishiwaki, Y. Kuniyoshi, K. Okada, M. Inaba, H. Inoue, "Humanoid Arm Motion Planning Using Stereo Vision and RRT Search," Journal of Robotics and Mechatronics, Vol. 15, No. 2.
[14] K. Okada, Y. Suzuki, Y. Kuniyoshi, M. Inaba, H. Inoue, "Behavior Acquisition of Humanoid Robot by Imitation," The Robotics Society of Japan, Sept. 18-20.
[15] H. Kozima, H. Yano, "A Robot that Learns to Communicate with Human Caregivers," First International Workshop on Epigenetic Robotics, Lund, Sweden.
[16] H. Kozima, E. Vatikiotis-Bateson, "Communicative Criteria for Processing Time/Space-Varying Information," International Workshop on Robot and Human Interactive Communication (RO-MAN 2001), Paris, France.
[17] M. Imai, T. Ono, H. Ishiguro, "Physical Relation and Expression: Joint Attention for Human-Robot Interaction," RO-MAN 2001.
[18] T. Nishimura, S. Nozaki, H. Yabe, R. Oka, "Interactive Motion Dialogue between Human Gesture and Movement of Mobile Robot," The Japanese Society for Artificial Intelligence, SIG-C11-2000-JULY-06.
[19] S. Takase, K. Hanahara, Y. Tada, "A Study of Robot Arm Imitating Human Arm Postures Based on Visual Information," The Society of Instrument and Control Engineers.
[20] Y. Miyake, "Man-Machine Communication as Co-Generation Process," Human Interface Society.
[21] T. Komatsu, K. Suzuki, K. Ueda, K. Hiraki, N. Oka, "Speech Meaning Acquisition Model by Interaction with its User," Human Interface Society.

HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments

HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments Book Title Book Editors IOS Press, 2003 1 HRP-2W: A Humanoid Platform for Research on Support Behavior in Daily life Environments Tetsunari Inamura a,1, Masayuki Inaba a and Hirochika Inoue a a Dept. of

More information

Development of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics -

Development of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics - Development of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics - Hiroshi Ishiguro 1,2, Tetsuo Ono 1, Michita Imai 1, Takayuki Kanda

More information

Body Movement Analysis of Human-Robot Interaction

Body Movement Analysis of Human-Robot Interaction Body Movement Analysis of Human-Robot Interaction Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono ATR Intelligent Robotics & Communication Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun,

More information

Adaptive Human-Robot Interaction System using Interactive EC

Adaptive Human-Robot Interaction System using Interactive EC Adaptive Human-Robot Interaction System using Interactive EC Yuki Suga, Chihiro Endo, Daizo Kobayashi, Takeshi Matsumoto, Shigeki Sugano School of Science and Engineering, Waseda Univ.,Tokyo, Japan. {ysuga,

More information

Constructivist Approach to Human-Robot Emotional Communication - Design of Evolutionary Function for WAMOEBA-3 -

Constructivist Approach to Human-Robot Emotional Communication - Design of Evolutionary Function for WAMOEBA-3 - Constructivist Approach to Human-Robot Emotional Communication - Design of Evolutionary Function for WAMOEBA-3 - Yuki SUGA, Hiroaki ARIE,Tetsuya OGATA, and Shigeki SUGANO Humanoid Robotics Institute (HRI),

More information

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino

More information

SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The

SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The 29 th Annual Conference of The Robotics Society of

More information

Associated Emotion and its Expression in an Entertainment Robot QRIO

Associated Emotion and its Expression in an Entertainment Robot QRIO Associated Emotion and its Expression in an Entertainment Robot QRIO Fumihide Tanaka 1. Kuniaki Noda 1. Tsutomu Sawada 2. Masahiro Fujita 1.2. 1. Life Dynamics Laboratory Preparatory Office, Sony Corporation,

More information

Robotics for Children

Robotics for Children Vol. xx No. xx, pp.1 8, 200x 1 1 2 3 4 Robotics for Children New Directions in Child Education and Therapy Fumihide Tanaka 1,HidekiKozima 2, Shoji Itakura 3 and Kazuo Hiraki 4 Robotics intersects with

More information

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced

More information

Reading human relationships from their interaction with an interactive humanoid robot

Reading human relationships from their interaction with an interactive humanoid robot Reading human relationships from their interaction with an interactive humanoid robot Takayuki Kanda 1 and Hiroshi Ishiguro 1,2 1 ATR, Intelligent Robotics and Communication Laboratories 2-2-2 Hikaridai

More information

Assess how research on the construction of cognitive functions in robotic systems is undertaken in Japan, China, and Korea

Assess how research on the construction of cognitive functions in robotic systems is undertaken in Japan, China, and Korea Sponsor: Assess how research on the construction of cognitive functions in robotic systems is undertaken in Japan, China, and Korea Understand the relationship between robotics and the human-centered sciences

More information

Motion Behavior and its Influence on Human-likeness in an Android Robot

Motion Behavior and its Influence on Human-likeness in an Android Robot Motion Behavior and its Influence on Human-likeness in an Android Robot Michihiro Shimada (michihiro.shimada@ams.eng.osaka-u.ac.jp) Asada Project, ERATO, Japan Science and Technology Agency Department

More information

ISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1

ISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1 Development of Multi-D.O.F. Master-Slave Arm with Bilateral Impedance Control for Telexistence Riichiro Tadakuma, Kiyohiro Sogen, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi 7-3-1 Hongo, Bunkyo-ku,

More information

Cooperative embodied communication emerged by interactive humanoid robots

Cooperative embodied communication emerged by interactive humanoid robots Int. J. Human-Computer Studies 62 (2005) 247 265 www.elsevier.com/locate/ijhcs Cooperative embodied communication emerged by interactive humanoid robots Daisuke Sakamoto a,b,, Takayuki Kanda b, Tetsuo

More information

Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano Zecca 2,3, Minoru Saito 1, Yu Mizoguchi 1, Kazuko Itoh 3,5, and Atsuo Takanishi 2,4,5

Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano Zecca 2,3, Minoru Saito 1, Yu Mizoguchi 1, Kazuko Itoh 3,5, and Atsuo Takanishi 2,4,5 2008 IEEE International Conference on Robotics and Automation Pasadena, CA, USA, May 19-23, 2008 Development of Whole-body Emotion Expression Humanoid Robot Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano

More information

Does the Appearance of a Robot Affect Users Ways of Giving Commands and Feedback?

Does the Appearance of a Robot Affect Users Ways of Giving Commands and Feedback? 19th IEEE International Symposium on Robot and Human Interactive Communication Principe di Piemonte - Viareggio, Italy, Sept. 12-15, 2010 Does the Appearance of a Robot Affect Users Ways of Giving Commands

More information

HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA

HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN- ROBOT COMMUNICATION THROUGH HAPTIC MEDIA RIKU HIKIJI AND SHUJI HASHIMOTO Department of Applied Physics, School of Science and Engineering, Waseda University 3-4-1

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

Informing a User of Robot s Mind by Motion

Informing a User of Robot s Mind by Motion Informing a User of Robot s Mind by Motion Kazuki KOBAYASHI 1 and Seiji YAMADA 2,1 1 The Graduate University for Advanced Studies 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430 Japan kazuki@grad.nii.ac.jp

More information

Head motion synchronization in the process of consensus building

Head motion synchronization in the process of consensus building Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe International Conference Center, Kobe, Japan, December 15-17, SA1-K.4 Head motion synchronization in the process of

More information

Analysis of humanoid appearances in human-robot interaction

Analysis of humanoid appearances in human-robot interaction Analysis of humanoid appearances in human-robot interaction Takayuki Kanda, Takahiro Miyashita, Taku Osada 2, Yuji Haikawa 2, Hiroshi Ishiguro &3 ATR Intelligent Robotics and Communication Labs. 2 Honda

More information

Affordance based Human Motion Synthesizing System

Affordance based Human Motion Synthesizing System Affordance based Human Motion Synthesizing System H. Ishii, N. Ichiguchi, D. Komaki, H. Shimoda and H. Yoshikawa Graduate School of Energy Science Kyoto University Uji-shi, Kyoto, 611-0011, Japan Abstract

More information

Knowledge-Based Person-Centric Human-Robot Interaction Using Facial and Hand Gestures

Knowledge-Based Person-Centric Human-Robot Interaction Using Facial and Hand Gestures Knowledge-Based Person-Centric Human-Robot Interaction Using Facial and Hand Gestures Md. Hasanuzzaman*, T. Zhang*, V. Ampornaramveth*, H. Gotoda *, Y. Shirai**, H. Ueno* *Intelligent System Research Division,

More information

BODILY NON-VERBAL INTERACTION WITH VIRTUAL CHARACTERS

Similar documents:
Bodily Non-verbal Interaction with Virtual Characters. Marco Gillies, Department of Computing. KEER2010, International Conference on Kansei Engineering and Emotion Research, Paris, March 2-4, 2010.
Intent Imitation using Wearable Motion Capturing System with On-line Teaching of Task Attention. Tetsunari Inamura, Naoki Kojo, Tomoyuki Sonoda, Kazuyuki Sakamoto, Kei Okada and Masayuki Inaba.
A WOZ Environment for Studying Mutual Adaptive Behaviors in Gesture-based Human-Robot Interaction. Yong Xu, Shinpei Takeda and Toyoaki Nishida, Graduate School of Informatics, Kyoto University.
Natural Interaction with Social Robots. Workshop organized by Kerstin Dautenhahn. http://homepages.stca.herts.ac.uk/~comqkd/tg-naturalinteractionwithsocialrobots.html
Concept and Architecture of a Centaur Robot. Satoshi Tsuda, Yohsuke Oda, Kuniya Shinozaki and Ryohei Nakatsu, Kwansei Gakuin University, School of Science and Technology.
Live Feeling on Movement of an Autonomous Robot Using a Biological Signal. Shigeru Sakurazawa, Keisuke Yanagihara, Yasuo Tsukahara and Hitoshi Matsubara, Future University-Hakodate.
A Constructive Approach for Communication Robots. Takayuki Kanda.
Development of a Robot Quizmaster with Auditory Functions for Speech-based Multiparty Interaction. Proceedings of the 2014 IEEE/SICE International Symposium on System Integration, Chuo University, Tokyo, Japan, December 13-15, 2014.

Development and Evaluation of a Centaur Robot. Satoshi Tsuda, Kuniya Shinozaki and Ryohei Nakatsu, Kwansei Gakuin University, School of Science and Technology.
Online Knowledge Acquisition and General Problem Solving in a Real World by Humanoid Robots. Naoya Makibuchi, Furao Shen and Osamu Hasegawa.
Enhanced Human-Agent Interaction: Augmenting Interaction Models with Embodied Agents. Serafin Bento, Master of Science in Information Systems thesis, Edmonton, Alberta, September 2015.
Understanding the Mechanism of Sonzai-Kan. ATR Intelligent Robotics and Communication Laboratories.
Rapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface. Kei Okada, Yasuyuki Kino, Fumio Kanehiro, Yasuo Kuniyoshi, Masayuki Inaba and Hirochika Inoue.
Sensor system of a small biped entertainment robot. Advanced Robotics, Vol. 18, No. 10, pp. 1039-1052 (2004).
Development of an Automatic Camera Control System for Videoing a Normal Classroom to Realize a Distant Lecture. Akira Suganuma, Department of Intelligent Systems, Kyushu University.
Contact Sensing Approach in Humanoid Robot Navigation. Hanafiah, Y., Ohka, M., Yamano, M. and Nasu, Y.
Making a Mobile Robot to Express its Mind by Motion Overlap. Kazuki Kobayashi (Shinshu University) and Seiji Yamada (National Institute of Informatics).
Stabilize humanoid robot teleoperated by a RGB-D sensor. Andrea Bisson, Andrea Busatto, Stefano Michieletto and Emanuele Menegatti, Intelligent Autonomous Systems Lab (IAS-Lab).
Socially interactive robots. Christine Upadek, MIN-Fakultät, Fachbereich Informatik, Universität Hamburg, 29 November 2010.
Analyzing the Human-Robot Interaction Abilities of a General-Purpose Social Robot in Different Naturalistic Environments. J. Ruiz-del-Solar, M. Mascaró, M. Correa, F. Bernuy, R. Riquelme et al.
Experimental Investigation into Influence of Negative Attitudes toward Robots on Human-Robot Interaction. Tatsuya Nomura, Department of Media Informatics, Ryukoku University.
Generating Personality Character in a Face Robot through Interaction with Human. F. Iida, M. Tabata and F. Hara, Department of Mechanical Engineering, Science University of Tokyo.
Interactive Humanoid Robots for a Science Museum. Masahiro Shiomi, Takayuki Kanda, Hiroshi Ishiguro and Norihiro Hagita, Osaka University and ATR IRC Laboratories.
Autonomic gaze control of avatars using voice information in virtual space voice chat system. Kinya Fujita, Toshimitsu Miyajima and Takashi Shimoji, Tokyo University of Agriculture and Technology.
Preliminary Investigation of Moral Expansiveness for Robots. Tatsuya Nomura, Kazuki Otsubo and Takayuki Kanda.
United States Patent US 6,604,022 B2: Robot for Autonomous Operation. Parker et al., August 5, 2003.
Essay on "A Survey of Socially Interactive Robots" (Terrence Fong, Illah Nourbakhsh and Kerstin Dautenhahn). Summarized by Mehwish Alam.

Flexible Cooperation between Human and Robot by interpreting Human Intention from Gaze Information. Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, September 28 - October 2, 2004.
Android (Child android): social and ethical issues. Hiroshi Ishiguro, Department of Adaptive Machine Systems, Osaka University; ATR Intelligent Robotics and Communications Laboratories; JST ERATO.
Person Identification and Interaction of Social Robots by Using Wireless Tags. Takayuki Kanda, Takayuki Hirano, Daniel Eaton and Hiroshi Ishiguro, ATR Intelligent Robotics and Communication Laboratories.
Human Robotics Interaction (HRI) based Analysis using DMT. Rimmy Chuchra and R. K. Seth, Sri Sai College of Engineering and Technology, Manawala, Amritsar.
Curriculum vitae (education and academic degrees) of Akihiko Yamaguchi, Nara Institute of Science and Technology.
Computer Vision in Human-Computer Interaction. Matti Pietikäinen, invited talk at the 2010 Autumn Seminar and Meeting of the Pattern Recognition Society of Finland, 26.11.2010.
Robot Personality from Perceptual Behavior Engine: An Experimental Study. Dongwook Shin, Jangwon Lee, Hun-Sue Lee and Sukhan Lee, Sungkyunkwan University.
Evaluation of a Tricycle-style Teleoperational Interface for Children: a Comparative Experiment with a Video Game Controller. 2012 IEEE RO-MAN, Paris, France, September 9-13, 2012.
Implications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA. Tatsuya Nomura, Takayuki Kanda, Tomohiro Suzuki et al.
Toward an Augmented Reality System for Violin Learning Support. Hiroyuki Shiino, François de Sorbier and Hideo Saito, Graduate School of Science and Technology, Keio University.
Prediction and Correction Algorithm for a Gesture Controlled Robotic Arm. Pushkar Shukla, Shehjar Safaya and Utkarsh Sharma, College of Engineering Roorkee.
Android as a Telecommunication Medium with a Human-like Presence. Daisuke Sakamoto, Takayuki Kanda, Tetsuo Ono, Hiroshi Ishiguro and Norihiro Hagita, ATR Intelligent Robotics Laboratories.
Application of network robots to a science museum. Takayuki Kanda, Masahiro Shiomi, Hiroshi Ishiguro and Norihiro Hagita, ATR IRC Laboratories and Osaka University.
CB2: A Child Robot with Biomimetic Body for Cognitive Developmental Robotics. Takashi Minato, Yuichiro Yoshikawa, Shuhei Ikemoto, Hiroshi Ishiguro, Minoru Asada et al.
Human Robot Dialogue Interaction. Barry Lumpkin.
Converting Motion between Different Types of Humanoid Robots Using Genetic Algorithms. Mari Nishiyama and Hitoshi Iba.
Evolutionary Computation and Machine Intelligence. Prabhas Chongstitvatana, Chulalongkorn University.
Optic Flow Based Skill Learning for a Humanoid to Trap, Approach to, and Pass a Ball. Masaki Ogino, Masaaki Kikuchi, Jun'ichiro Ooga, Masahiro Aono and Minoru Asada.
Interaction Debugging: an Integral Approach to Analyze Human-Robot Interaction. Tijn Kooijmans, Takayuki Kanda, Christoph Bartneck, Hiroshi Ishiguro and Norihiro Hagita.
The effect of gaze behavior on the attitude towards humanoid robots. Stefan Patelski, bachelor thesis, supervised by Raymond H. Cuijpers and Elena Torta, Human Technology Interaction Group, 27-08-2012.

Face Registration Using Wearable Active Vision Systems for Augmented Memory. Takekazu Kato et al., DICTA2002: Digital Image Computing Techniques and Applications, Melbourne, Australia, 21-22 January 2002.
Lecture material by Jane Li, Assistant Professor, Mechanical Engineering Department, Robotic Engineering Program, Worcester Polytechnic Institute.
Physical and Affective Interaction between Human and Mental Commit Robot. Takanori Shibata and Kazuo Tanie, Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001.
Drumming with a Humanoid Robot: Lessons Learnt from Designing and Analysing Human-Robot Interaction Studies. Hatice Kose-Bagci, Kerstin Dautenhahn and Chrystopher L. Nehaniv, Adaptive Systems Research Group.
Adaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers. Proceedings of the 3rd International Conference on Mechanical Engineering and Mechatronics, Prague, Czech Republic, August 14-15, 2014.
Learning Interaction Rules through Compression of Sensori-Motor Causality Space. Proceedings of the Tenth International Conference on Epigenetic Robotics, Lund University Cognitive Studies, 2010.
Extracting Multimodal Dynamics of Objects Using RNNPB. Tetsuya Ogata, Hayato Ohba, Jun Tani, Kazunori Komatani and Hiroshi G. Okuno, Graduate School of Informatics, Kyoto University.
A*STAR Unveils Singapore's First Social Robots at Robocup2010. Media release, Singapore, 21 June 2010.
Intent Expression Using Eye Robot for Mascot Robot System. Yoichi Yamazaki, Fangyan Dong, Yuta Masuda, Yukiko Uehara, Petar Kormushev, Hai An Vu, Phuc Quang Le and Kaoru Hirota.
Can a social robot train itself just by observing human interactions? Dylan F. Glas, Phoebe Liu, Takayuki Kanda and Hiroshi Ishiguro.
Development of Video Chat System Based on Space Sharing and Haptic Communication. Takahiro Hayashi and Keisuke Suzuki, Sensors and Materials, Vol. 30, No. 7 (2018), pp. 1427-1435.
Reading a Robot's Mind: A Model of Utterance Understanding based on the Theory of Mind Mechanism. Tetsuo Ono et al., AAAI-00 Proceedings, 2000.
ROMEO Humanoid for Action and Communication. Rodolphe Gelin, Aldebaran Robotics, 7th Workshop on Humanoid Soccer Robots, Osaka, November 2012.
A Survey of Socially Interactive Robots. Terrence Fong, Illah Nourbakhsh and Kerstin Dautenhahn; presented by Mehwish Alam.
Why people build humanoid robots. Marek Perkowski, ECE seminar, Friday, January 26, 2001.
Controlling Humanoid Robot Using Head Movements. S. Mounica, A. Naga Bhavani, Namani Niharika et al., International Journal of Engineering and Management Research, Vol. 5, Issue 2, April 2015, pp. 648-652.
Prediction of Human's Movement for Collision Avoidance of Mobile Robot. Shunsuke Hamasaki, Yusuke Tamura, Atsushi Yamashita and Hajime Asama.
Young Children's Folk Knowledge of Robots. Nobuko Katayama (College of Letters, Ritsumeikan University) and Jun'ichi Katayama.
Robotic Systems ECE 401RB Fall 2007, Lecture 14: Cooperation among Multiple Robots, Part 2 (Chapter 12, George A. Bekey, Autonomous Robots: From Biological Inspiration to Implementation).

Virtual Operator in Virtual Control Room: The Prototype System Implementation. H. Shimoda, H. Ishii, W. Wu, D. Li, T. Nakagawa and H. Yoshikawa, Graduate School of Energy Science, Kyoto University.
SPENCER Questions & Answers: What will the robot do during the final demonstration? SPENCER is a European Union-funded research project on intelligent robots that operate in human environments.
Proactive Conversation between Multiple Robots to Improve the Sense of Human-Robot Conversation. Yuichiro Yoshikawa et al., AAAI Technical Report FS-17-04.
Controlling Methods and Challenges of Robotic Arm. Aniket D. Kulkarni and Dr. Sayyad Ajij D., MIT Aurangabad.
A Communication Model for Inter-vehicle Communication Simulation Systems Based on Properties of Urban Areas. IJCSNS International Journal of Computer Science and Network Security, Vol. 6, No. 10, October 2006.
Silhouettell: Awareness Support for Real-World Encounter. In Toru Ishida (Ed.), Community Computing and Support Systems, Lecture Notes in Computer Science 1519, Springer-Verlag, pp. 317-330, 1998.
Gesture Based Human Multi-Robot Interaction. Gerard Canal, Cecilio Angulo and Sergio Escalera.
Social Acceptance of Humanoid Robots. Tatsuya Nomura, Department of Media Informatics, Ryukoku University.
Robot: iCub, a humanoid that helps us study the brain. National Geographic profile article by Robohub. http://education.nationalgeographic.org/news/robot-icub/