Physical and Affective Interaction between Human and Mental Commit Robot
Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001

Takanori Shibata, Kazuo Tanie
Institute of Intelligent Systems
National Institute of Advanced Industrial Science and Technology
Under the Ministry of Economy, Trade and Industry

Abstract: Recent advances in robotics have been applied to automation in industrial manufacturing, with the primary purpose of optimizing practical systems in terms of objective measures such as accuracy, speed, and cost. This paper introduces research on mental commit robots, which pursues a different direction that does not depend so rigidly on such objective measures. The main goal of this research is to explore a new area in robotics, with an emphasis on human-robot interaction. In previous research, we introduced a cat robot and evaluated it by interviewing many people. The results showed that physical interaction improved subjective evaluation. However, some subjects commented severely on the structure of the cat robot when they interacted with it physically. Because of the cat robot's appearance, subjects associated it with a real cat, drawing on their own experience and knowledge, and then compared the robot with a real cat. This paper investigates the influence of a priori knowledge on the subjective interpretation and evaluation of mental commit robots. We developed a new seal robot with the appearance of a baby harp seal. Most people neither know harp seals precisely nor have experience interacting with them. Subjects then evaluated the seal robot while interacting with it.

1. Introduction

A human understands people or objects through interaction. The more and the longer people interact, the more deeply they understand one another. Long interaction can result in attachment and a desire for further interaction. Interaction stimulates humans and generates motivation for behavior.
The objects with which humans interact include natural objects, animals, and artifacts. Studies on interaction between human beings and animals show positive effects on psychology, child development, and so on [1]. Artifacts that affect people mentally can be called "aesthetic objects." Such effects are subjective and cannot be measured simply in terms of objective measures. Machines are also artifacts. Unlike aesthetic objects, machines have been designed and developed as tools for human beings and evaluated in terms of objective measures [2]. However, machines that exist among and interact with humans need to be evaluated by humans in terms of their subjective measures. There are many studies on human-machine interaction. Here, we do not discuss studies on human factors in controlling machines used as tools. In other studies, machines recognize human gestures or emotions from sensory information and then act or provide information to the human. However, modeling gestures or emotions is very difficult because they depend on the situation, the context, and the cultural background of each person. Concerning action by a machine toward a human, an artificial creature in cyberspace can give only visual and auditory information to a human. A machine with a physical body is more influential on the human mind than a virtual creature. Considerable research on autonomous robots has been carried out, with various purposes such as navigation, exploration, and delivery in structured or unstructured environments, while the robots adapt to those environments. In addition, some robots have been developed to show emotional expressions through the face or gestures [9]. However, although such robots have physical bodies, most of them are not intended to interact physically with a human. We have been building animal-type robots in order to investigate human-machine interaction for designing human-friendly robots [1, 3-7].
Animal-type robots have physical bodies and behave autonomously while generating motivations by themselves. They interact with human beings physically. When we engage physically with an animal-type robot, it stimulates our affection. We then have positive emotions, such as happiness and love, or negative emotions, such as anger and fear. Through physical interaction, we develop attachment to the animal-type robot while evaluating it as intelligent or stupid by our subjective measures. In this paper, animal-type robots that give mental value to human beings are referred to as mental commit robots.
Fig. 1 Subjective Interpretation and Evaluation of Artifact through Interaction

Section 2 discusses objectivity and subjectivity. Section 3 discusses the subjective interpretation and evaluation of a robot through physical interaction. Section 4 explains previous research and development of mental commit robots; we categorize robot appearance into four categories. Section 5 introduces a new seal robot and evaluates it. We then discuss the influence of a priori knowledge on models in subjective interpretation and evaluation. Finally, Section 6 concludes this paper.

2. Objectivity and Subjectivity

Science and technology have been developed through objectivism. Because of objectivism, people can share and use scientific and technological knowledge in common. When we design machines, we need to use such objective knowledge. A machine that is highly valued in terms of objective measures is useful as a tool for human beings, especially for automation. However, a machine that interacts with a human is not always evaluated by such objective measures: people evaluate such a machine subjectively. Even if a machine is useless in terms of objective evaluation, some people place high subjective value on it. When we design robots that interact with human beings, we have to consider how people think of the robots subjectively. This paper deals with mental commit robots to investigate subjectivity for designing robots friendly to human beings.

3. Subjective Interpretation and Evaluation of Robot through Physical Interaction

When a human interacts with a robot, he perceives it through his sense organs: vision, audition, touch, taste, olfaction, and so on.
He interprets the meaning of the robot's behavior based on his senses, his memory, and his knowledge (Fig. 1). Depending on his subjective interpretation, he evaluates the robot. In the case of a robot in computer graphics (simulation), a human perceives the robot only through vision and audition; even if a precise rendering of the robot is presented, only two modalities of the subject can be stimulated. In order to improve subjective evaluation, the number of modalities, as well as their quality, should be increased. A real robot, in contrast, has a physical body. When a human interacts with a robot physically, the human senses the robot through multiple modalities. In previous research, we investigated the subjective interpretation of a robot's behaviors through psychological experiments. In the experiments, a picture of a dog was equipped with a tail with one degree of freedom (DOF), and subjects were asked to interpret the emotions of the dog by watching the wagging tail [4]. A simple tactile sensor was then added to the system, and the tail wagged in response to the subject's stroking of the sensor. In the first experiment, subjects interpreted the meaning of the wagging from visual and auditory information. In the second, subjects had tactile information in addition to vision and audition. As a result, the second experiment was much more impressive for most subjects because of the tangible physical interaction. In addition, interpretations of emotions varied with the subjects' knowledge of dogs; for example, some had experience owning dogs. Therefore, multiple modalities are important in human-robot interaction, and a priori knowledge influences subjective interpretation.

4. Previous Research and Development of Mental Commit Robot

There are four categories in terms of the appearance of mental commit robots:
Category 1: Human
Category 2: Familiar Animal as Pet (e.g., cat and dog)
Category 3: Non-familiar Animal as Pet (e.g., seal, penguin, and whale)
Category 4: New Character (artificially designed character: e.g., AIBO [8], R)

We developed three types of mental commit robots in our previous research: the first in category 4, the second in category 3, and the third in category 2.

4.1 Dog Robot in Category 4

The first was a dog robot that had visual, auditory, and tactile sensors, a tail with one DOF, and mobility by three wheels (Fig. 2) [4]. Unlike other research, we emphasized tangibility for physical interaction between a human and a robot. For this purpose, we developed a new tactile sensor that consisted of a pressure sensor and a balloon covered with artificial skin. The robot was able to sense touch such as pushing, stroking, and patting. It behaved according to its internal state, which consisted of current input from its sensors and regressive (recurrent) input from itself. A human interacting with the robot by touching or stroking obtained visual, auditory, and tactile information, and felt a softness and pleasant texture, like that of a real creature, through touch. Depending on this information, the human changed his behavior. This loop can be considered a coupling between the human and the robot. Although the robot did not have an explicit emotion model, people interacting with the robot interpreted its behaviors as emotional.

4.2 Seal Robot (Version 1) in Category 3

The second was a seal robot (Fig. 3). The seal robot in the previous research had a simple structure in order to investigate emergent emotions through physical interaction. The robot had two legs with two servomotors; each leg had a clutch bearing at its point of contact with the floor. At the front and back of its body, the robot had two supports: the front support was a caster, and the back support had a clutch bearing at its contact point with the floor.
When the robot moved its two legs back and forth at the same time, it moved forward as if crawling. When it moved the two legs back and forth alternately, it did not move forward but shook its body. When one leg was held fixed and the other moved back and forth, the robot turned toward the side of the fixed leg. As for the sensory system, the robot had two whiskers that sensed contact with its environment and two pressure sensors with balloons that sensed pushing, patting, and stroking of its body. The robot had a 6811 CPU inside to control itself. Its weight was about 1.0 kg. The robot had an internal state that depended on sensory information and recurrent information. The input to the state was a weighted sum of the pressure-sensor values, a value based on the whisker sensors, and the values of the previous state. The value of the internal state was input to a neural oscillator with two neurons, which generated the motion pattern of a leg. The phase of the two-neuron oscillator was controlled by sensory information. Although it had no explicit model of emotions, the robot had rules to generate its motivation, to change its attention, and to control the movement of its legs. When people interacted with the robot, they interpreted the robot's behavior in different ways, using words of emotion to express what the robot was doing. Because the movement of the robot depended on context, the robot's behaviors were interpreted as more complex than the number of implemented functions. This was the effect of emergent emotions. The complexity of interpretation depended on the subject, because it was a subjective view.

Fig. 2 Dog Robot; Fig. 3 Seal Robot (Version 1); Fig. 4 Cat Robot

4.3 Cat Robot in Category 2

The third was a cat robot (Fig. 4) [5-7]. The cat robot had a more complex structure than the seal robot; it was built by OMRON Corp. The robot had tactile, auditory, and postural sensors to perceive human action and its environment.
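As an aside, the paper does not give the equations of the two-neuron oscillator that drove the version-1 seal robot's legs (Section 4.2). The sketch below is one common realization, a Matsuoka-style mutual-inhibition pair; every parameter value and variable name here is illustrative, not taken from the robot.

```python
# Sketch of a two-neuron mutual-inhibition oscillator for one leg.
# All parameters are illustrative; they satisfy the usual conditions
# for sustained oscillation (a > 1 + tau/T and beta > a - 1).

def step(state, s, dt=0.01, tau=0.25, T=0.5, beta=2.5, a=2.5):
    """One Euler step. s is the tonic drive, e.g. the robot's internal state."""
    u1, v1, u2, v2 = state
    y1, y2 = max(0.0, u1), max(0.0, u2)          # rectified firing rates
    du1 = (-u1 - beta * v1 - a * y2 + s) / tau   # membrane with fatigue v1
    dv1 = (-v1 + y1) / T                         # fatigue (adaptation)
    du2 = (-u2 - beta * v2 - a * y1 + s) / tau   # mutual inhibition via y1
    dv2 = (-v2 + y2) / T
    return (u1 + dt * du1, v1 + dt * dv1, u2 + dt * du2, v2 + dt * dv2)

state = (0.1, 0.0, 0.0, 0.0)     # small asymmetry starts the oscillation
outputs = []                     # y1 - y2 would drive the leg servo
for _ in range(3000):            # 30 s of simulated time
    state = step(state, s=1.0)
    outputs.append(max(0.0, state[0]) - max(0.0, state[2]))
```

Adding a sensory term to u1 and u2 would entrain the oscillator's phase, matching the statement that the phase was controlled by sensory information.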
For tactile sensing, it had piezoelectric force sensors on its back and one on its head, and microswitches at its chin and cheek. The robot could recognize stroking, hitting, and touching. For audition, the robot had three microphones in its head for sound localization and speech recognition. For posture, the robot had a 3D gyro-moment sensor. Information was processed by one 32-bit RISC chip, two DSPs for sensors and motors, and one IC for speech recognition. Its weight was 1.5 kg including the battery. As for action, the cat robot had eight actuators: one for the eyelids, two for the neck, two for each front leg, and one for the tail. There was one passive joint at each front leg's ankle and two passive joints at each rear leg. The cat robot was expected to behave as if alive. The robot had some basic behavioral patterns that mimicked those of a real cat. In addition, it had some internal states. From the viewpoint of the robot's designer, the patterns and states could be named with words of emotion when he designed the software.

We evaluated the cat robot by asking questions of eighty-eight people, all Japanese women who liked animals or stuffed animals, with ages ranging from the twenties to the sixties. We conducted the experiment with one subject at a time. The interview had four purposes. The first was to investigate whether the cat robot could generate value in subjects' minds. The second was whether physical interaction improved subjective value. The third was what factors were important for subjects to regard the cat robot as lifelike. The fourth was whether subjects acknowledged the value of the cat robot's existence. First, we explained the cat robot to a subject and showed it without movement, then asked for the subject's first impression of the cat robot. Second, we asked the subject to interact with the cat robot freely for about twenty minutes, and then asked the subject some questions. 66% of the subjects answered that the cat robot was very cute or cute, 91% wanted to touch it, and 81% wanted to talk to it. Some were suspicious of whether an artificial cat would behave like a real cat.
In addition, some subjects said that the appearance of the cat robot was not cute at all and that it looked like just a toy. These results show that appearance is very important for the first impression. Each subject interacted with the cat robot for about thirty minutes. As the first question, we asked each subject whether the cat robot was cute. On first impression, 66% of the subjects answered positively that the cat robot was very cute or cute. After interacting with the cat robot, the proportion of positive answers increased from 66% to 74%. This means that physical interaction improved the subjective value of the cat robot. However, the proportion of subjects who answered that the cat robot was very cute decreased from 38% to 32%, while the proportion who answered that it was cute increased from 28% to 42%. In order to investigate these results, we asked subjects why they changed their evaluations. The reasons for decreased evaluation were as follows:
- The texture of the fur was not good to the touch.
- The body was harder than expected.
- The motors were noisy.
- The voice of the cat robot was not cute.
- The mouth did not open.
- Etc.
On the other hand, the reasons for increased evaluation were as follows:
- The cat robot reacted as if it were real.
- Attachment arose after physical interaction.
- The subject played with it without getting bored.
- The cat robot made the subject want to take care of it.
- Etc.
In summary, the reasons for negative change depended on the quality of the cat robot's structure and hardware. Because subjects had considerable knowledge of real cats, they compared the robot with their memories and evaluated it severely. This means that a priori knowledge has a strong influence on subjective evaluation.

5. New Seal Robot (Version 3) in Category 3

In order to investigate the influence of a priori knowledge, we developed and evaluated a new seal robot belonging to category 3 (Fig. 5).
5.1 Specifications

Its appearance is modeled on a baby harp seal, which has white fur for three weeks after birth. The new seal robot was built by Sankyo Aluminum Industry. For perception, the seal robot has tactile, visual, auditory, and posture sensors beneath its soft, white artificial fur; for tactile sensing, ten balloon sensors were applied. For action, it has six actuators: one for the eyelids, two for the neck, one for each front fin, and one for the two rear fins. The weight of the seal robot is 3.4 kg. The robot has a behavior-generation system that consists of two hierarchical layers of processes, proactive and reactive, which together generate three kinds of behaviors, as follows:
Fig. 5 New Seal Robot (Version 3): (a) Active, (b) Winking Eyes

(1) Proactive behaviors: The robot has two layers for generating its proactive behaviors: a behavior-planning layer and a behavior-generation layer. Considering internal states, stimuli, desires, rhythm, and so on, the robot generates proactive behaviors.

(a) Behavior-planning layer: This layer has a state-transition network based on the robot's internal states and its desires, which are produced by its internal rhythm. The robot has internal states that can be named with words of emotion. Each state has a numerical level, is changed by stimulation, and decays over time. Interaction changes the internal states and creates the character of the robot. The behavior-planning layer sends basic behavioral patterns, which include poses and motions, to the behavior-generation layer. Although these behaviors are called proactive, they are very primitive compared with those of human beings; we implemented behaviors similar to those of a real seal.

(b) Behavior-generation layer: This layer generates control references for each actuator to perform the determined behavior. The control references depend on the strength of the internal states and their variation; for example, parameters change the speed of a movement and the number of repetitions of the same behavior. Therefore, although the number of basic patterns is countable, the number of emergent behaviors is practically uncountable because the numerical parameters vary continuously. This creates lifelike behavior. In addition, for attention, the behavior-generation layer adjusts the priorities of reactive and proactive behaviors based on the strength of the internal states. This function contributes to situated behavior and makes it difficult for a subject to predict the robot's actions.

(c) Long-term memory: The robot has a reinforcement learning function. It puts positive value on preferable stimulation, such as being stroked.
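A minimal sketch of how the internal-state levels of (a) and the learned values of (c) might be maintained is given below. The state names, behavior names, decay constant, and learning rule are all assumptions made for illustration; the paper does not specify them.

```python
# Illustrative bookkeeping for the behavior-planning layer: named internal
# states with numerical levels that decay over time and are changed by
# stimulation, plus learned values on (stimulation, behavior) pairs.

DECAY = 0.95  # per-tick decay applied to every internal state level

class BehaviorPlanner:
    def __init__(self):
        self.states = {"pleasure": 0.0, "anger": 0.0, "arousal": 0.0}
        self.values = {}  # (stimulus, behavior) -> learned value

    def tick(self):
        # "The state decays by time."
        for name in self.states:
            self.states[name] *= DECAY

    def stimulate(self, stimulus):
        # "Interaction changes internal states and creates character..."
        if stimulus == "stroked":
            self.states["pleasure"] += 1.0
        elif stimulus == "hit":
            self.states["anger"] += 1.0
        self.states["arousal"] += 0.5

    def reinforce(self, stimulus, behavior, reward, lr=0.3):
        # Long-term memory: move the stored value toward the reward.
        key = (stimulus, behavior)
        old = self.values.get(key, 0.0)
        self.values[key] = old + lr * (reward - old)

    def select_behavior(self):
        # The strongest internal state biases which basic pattern is
        # sent on to the behavior-generation layer.
        dominant = max(self.states, key=self.states.get)
        return {"pleasure": "snuggle", "anger": "turn_away",
                "arousal": "look_around"}[dominant]

planner = BehaviorPlanner()
planner.stimulate("stroked")                 # stroking raises "pleasure"
planner.reinforce("stroked", "snuggle", reward=1.0)
planner.tick()                               # all levels decay a little
chosen = planner.select_behavior()
```

In this sketch the behavior-generation layer would then scale the speed and repetition count of the chosen pattern by the dominant state's level, as described in (b).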
It also puts negative value on undesirable stimulation, such as being hit. The robot assigns values to the relationships between stimulation and behaviors. Gradually, the robot can be shaped toward the behaviors its owner prefers.

(2) Reactive behaviors: The seal robot reacts to sudden stimulation. For example, when it suddenly hears a loud sound, the robot pays attention to it and looks in that direction. There are several patterns of combinations of stimulation and reaction. These patterns are assumed to be conditioned, unconscious behaviors.

(3) Physiological behaviors: The robot has a daily rhythm and some spontaneous desires, such as sleep, based on that rhythm.

5.2 Evaluation

Forty people, aged from the twenties to the fifties, were interviewed (Fig. 6). Nobody had experience interacting with a real baby harp seal, although one subject knew it very well. Before contact with the seal robot, 75% said the seal robot was cute and 90% wanted to touch it. They interacted with the seal robot for about ten minutes. Afterward, more than 90% said the seal robot was cute, and 80% said that they felt relaxed with it. Among the positive comments were that the body of the seal robot was soft with a comfortable texture, and that its winking eyes were very adorable. Among the negative comments were that the seal robot was a little heavy, and that it would be better if it moved more actively. In summary, since people had no a priori knowledge of a real baby harp seal, they neither compared the seal robot with a real seal nor made severe comments. Fig. 7 shows interaction between a senior woman and a seal robot. She loved the seal robot. After a while, however, she cried; the seal robot had reminded her of her dead dog.

5.3 Discussion

Comparing the evaluations of the cat robot in category 2 and the seal robot in category 3 shows that a priori knowledge has a strong influence on subjective interpretation and evaluation. When we design a robot with the appearance of a real animal, we have to consider the effect of comparison between the sensed impression of the robot and the image associated with it in the subject's memory. At this point, we have investigated this result by interviewing subjects after short-term interaction with the robot; we therefore do not take into account the effects of learning in either subjects or robots. We will investigate this by monitoring long-term interaction between subjects and robots.

6. Conclusions

We categorized the models of robots into four categories and have developed mental commit robots in categories 2, 3, and 4. This paper introduced a new seal robot in category 3 and evaluated it. Comparing the results with evaluations of robots in previous research, we showed that a priori knowledge of a robot's model has a strong influence on the subjective interpretation and evaluation of mental commit robots in short-term interaction.

References
[1] T. Shibata, et al., "Emotional Robot for Intelligent System - Artificial Emotional Creature Project," Proc. of the 5th IEEE Int'l Workshop on Robot and Human Communication (RO-MAN), 1996.
[2] H. Petroski, Invention by Design, Harvard University Press, 1996.
[3] T. Shibata and R. Irie, "Artificial Emotional Creature for Human-Robot Interaction - A New Direction for Intelligent System," Proc. of the IEEE/ASME Int'l Conf. on Advanced Intelligent Mechatronics (AIM'97), paper no. 47, CD-ROM proceedings, Jun. 1997.
[4] T. Shibata, et al., "Artificial Emotional Creature for Human-Machine Interaction," Proc. of the IEEE Int'l Conf. on Systems, Man, and Cybernetics, 1997.
[5] T. Tashima, S. Saito, M. Osumi, T. Kudo, and T. Shibata, "Interactive Pet Robot with Emotion Model," Proc. of the 16th Annual Conf. of the RSJ, Vol. 1, pp. 11-12, 1998.
[6] T. Shibata, T. Tashima, and K. Tanie, "Emergence of Emotional Behavior through Physical Interaction between Human and Robot," Proc.
of the 1999 IEEE Int'l Conf. on Robotics and Automation (ICRA'99), 1999.
[7] T. Shibata, T. Tashima (OMRON Corp.), and K. Tanie, "Subjective Interpretation of Emotional Behavior through Physical Interaction between Human and Robot," Proc. of the IEEE Int'l Conf. on Systems, Man, and Cybernetics, 1999.
[8] M. Fujita and K. Kageyama, "An Open Architecture for Robot Entertainment," Proc. of Agents '97, 1997.
[9] J. M. Hollerbach, "Entertainment Robots," Australian Conf. on Robotics & Automation, Brisbane, 1999.

Fig. 6 Graphs of questionnaire results ("Do you know the seal?", "Do you want to touch?", "Is the seal robot cute?", "Are you comfortable with the seal robot?")
Fig. 7 Interaction between Seal Robot and Senior People
More informationBody Movement Analysis of Human-Robot Interaction
Body Movement Analysis of Human-Robot Interaction Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono ATR Intelligent Robotics & Communication Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun,
More informationRobot: Geminoid F This android robot looks just like a woman
ProfileArticle Robot: Geminoid F This android robot looks just like a woman For the complete profile with media resources, visit: http://education.nationalgeographic.org/news/robot-geminoid-f/ Program
More informationRapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface
Rapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface Kei Okada 1, Yasuyuki Kino 1, Fumio Kanehiro 2, Yasuo Kuniyoshi 1, Masayuki Inaba 1, Hirochika Inoue 1 1
More informationComputer Animation of Creatures in a Deep Sea
Computer Animation of Creatures in a Deep Sea Naoya Murakami and Shin-ichi Murakami Olympus Software Technology Corp. Tokyo Denki University ABSTRACT This paper describes an interactive computer animation
More informationA Robust Neural Robot Navigation Using a Combination of Deliberative and Reactive Control Architectures
A Robust Neural Robot Navigation Using a Combination of Deliberative and Reactive Control Architectures D.M. Rojas Castro, A. Revel and M. Ménard * Laboratory of Informatics, Image and Interaction (L3I)
More informationROMEO Humanoid for Action and Communication. Rodolphe GELIN Aldebaran Robotics
ROMEO Humanoid for Action and Communication Rodolphe GELIN Aldebaran Robotics 7 th workshop on Humanoid November Soccer 2012 Robots Osaka, November 2012 Overview French National Project labeled by Cluster
More informationPerson Identification and Interaction of Social Robots by Using Wireless Tags
Person Identification and Interaction of Social Robots by Using Wireless Tags Takayuki Kanda 1, Takayuki Hirano 1, Daniel Eaton 1, and Hiroshi Ishiguro 1&2 1 ATR Intelligent Robotics and Communication
More informationRobo-Erectus Tr-2010 TeenSize Team Description Paper.
Robo-Erectus Tr-2010 TeenSize Team Description Paper. Buck Sin Ng, Carlos A. Acosta Calderon, Nguyen The Loan, Guohua Yu, Chin Hock Tey, Pik Kong Yue and Changjiu Zhou. Advanced Robotics and Intelligent
More informationTHE AI REVOLUTION. How Artificial Intelligence is Redefining Marketing Automation
THE AI REVOLUTION How Artificial Intelligence is Redefining Marketing Automation The implications of Artificial Intelligence for modern day marketers The shift from Marketing Automation to Intelligent
More informationZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2014
ZJUDancer Team Description Paper Humanoid Kid-Size League of Robocup 2014 Yu DongDong, Xiang Chuan, Zhou Chunlin, and Xiong Rong State Key Lab. of Industrial Control Technology, Zhejiang University, Hangzhou,
More information- Basics of informatics - Computer network - Software engineering - Intelligent media processing - Human interface. Professor. Professor.
- Basics of informatics - Computer network - Software engineering - Intelligent media processing - Human interface Computer-Aided Engineering Research of power/signal integrity analysis and EMC design
More informationROBOTICS ENG YOUSEF A. SHATNAWI INTRODUCTION
ROBOTICS INTRODUCTION THIS COURSE IS TWO PARTS Mobile Robotics. Locomotion (analogous to manipulation) (Legged and wheeled robots). Navigation and obstacle avoidance algorithms. Robot Vision Sensors and
More informationHaptic presentation of 3D objects in virtual reality for the visually disabled
Haptic presentation of 3D objects in virtual reality for the visually disabled M Moranski, A Materka Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND marcin.moranski@p.lodz.pl,
More informationAffordance based Human Motion Synthesizing System
Affordance based Human Motion Synthesizing System H. Ishii, N. Ichiguchi, D. Komaki, H. Shimoda and H. Yoshikawa Graduate School of Energy Science Kyoto University Uji-shi, Kyoto, 611-0011, Japan Abstract
More informationChapter 2 Introduction to Haptics 2.1 Definition of Haptics
Chapter 2 Introduction to Haptics 2.1 Definition of Haptics The word haptic originates from the Greek verb hapto to touch and therefore refers to the ability to touch and manipulate objects. The haptic
More informationInteraction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping
Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino
More informationAn Emotion Model of 3D Virtual Characters In Intelligent Virtual Environment
An Emotion Model of 3D Virtual Characters In Intelligent Virtual Environment Zhen Liu 1, Zhi Geng Pan 2 1 The Faculty of Information Science and Technology, Ningbo University, 315211, China liuzhen@nbu.edu.cn
More informationBLUE BRAIN - The name of the world s first virtual brain. That means a machine that can function as human brain.
CONTENTS 1~ INTRODUCTION 2~ WHAT IS BLUE BRAIN 3~ WHAT IS VIRTUAL BRAIN 4~ FUNCTION OF NATURAL BRAIN 5~ BRAIN SIMULATION 6~ CURRENT RESEARCH WORK 7~ ADVANTAGES 8~ DISADVANTAGE 9~ HARDWARE AND SOFTWARE
More informationIntelligent Robotics Sensors and Actuators
Intelligent Robotics Sensors and Actuators Luís Paulo Reis (University of Porto) Nuno Lau (University of Aveiro) The Perception Problem Do we need perception? Complexity Uncertainty Dynamic World Detection/Correction
More informationEmotional BWI Segway Robot
Emotional BWI Segway Robot Sangjin Shin https:// github.com/sangjinshin/emotional-bwi-segbot 1. Abstract The Building-Wide Intelligence Project s Segway Robot lacked emotions and personality critical in
More informationIntroduction to Haptics
Introduction to Haptics Roope Raisamo Multimodal Interaction Research Group Tampere Unit for Computer Human Interaction (TAUCHI) Department of Computer Sciences University of Tampere, Finland Definition
More informationDr. Ashish Dutta. Professor, Dept. of Mechanical Engineering Indian Institute of Technology Kanpur, INDIA
Introduction: History of Robotics - past, present and future Dr. Ashish Dutta Professor, Dept. of Mechanical Engineering Indian Institute of Technology Kanpur, INDIA Origin of Automation: replacing human
More informationHome-Care Technology for Independent Living
Independent LifeStyle Assistant Home-Care Technology for Independent Living A NIST Advanced Technology Program Wende Dewing, PhD Human-Centered Systems Information and Decision Technologies Honeywell Laboratories
More informationCreating Computer Games
By the end of this task I should know how to... 1) import graphics (background and sprites) into Scratch 2) make sprites move around the stage 3) create a scoring system using a variable. Creating Computer
More informationDesigning Toys That Come Alive: Curious Robots for Creative Play
Designing Toys That Come Alive: Curious Robots for Creative Play Kathryn Merrick School of Information Technologies and Electrical Engineering University of New South Wales, Australian Defence Force Academy
More informationCreating a 3D environment map from 2D camera images in robotics
Creating a 3D environment map from 2D camera images in robotics J.P. Niemantsverdriet jelle@niemantsverdriet.nl 4th June 2003 Timorstraat 6A 9715 LE Groningen student number: 0919462 internal advisor:
More informationES 492: SCIENCE IN THE MOVIES
UNIVERSITY OF SOUTH ALABAMA ES 492: SCIENCE IN THE MOVIES LECTURE 5: ROBOTICS AND AI PRESENTER: HANNAH BECTON TODAY'S AGENDA 1. Robotics and Real-Time Systems 2. Reacting to the environment around them
More informationQUTIE TOWARD A MULTI-FUNCTIONAL ROBOTIC PLATFORM
QUTIE TOWARD A MULTI-FUNCTIONAL ROBOTIC PLATFORM Matti Tikanmäki, Antti Tikanmäki, Juha Röning. University of Oulu, Computer Engineering Laboratory, Intelligent Systems Group ABSTRACT In this paper we
More informationSubsumption Architecture in Swarm Robotics. Cuong Nguyen Viet 16/11/2015
Subsumption Architecture in Swarm Robotics Cuong Nguyen Viet 16/11/2015 1 Table of content Motivation Subsumption Architecture Background Architecture decomposition Implementation Swarm robotics Swarm
More informationAcquisition of Multi-Modal Expression of Slip through Pick-Up Experiences
Acquisition of Multi-Modal Expression of Slip through Pick-Up Experiences Yasunori Tada* and Koh Hosoda** * Dept. of Adaptive Machine Systems, Osaka University ** Dept. of Adaptive Machine Systems, HANDAI
More informationMOBILE AND UBIQUITOUS HAPTICS
MOBILE AND UBIQUITOUS HAPTICS Jussi Rantala and Jukka Raisamo Tampere Unit for Computer-Human Interaction School of Information Sciences University of Tampere, Finland Contents Haptic communication Affective
More informationKorea Humanoid Robot Projects
Korea Humanoid Robot Projects Jun Ho Oh HUBO Lab., KAIST KOREA Humanoid Projects(~2001) A few humanoid robot projects were existed. Most researches were on dynamic and kinematic simulations for walking
More informationEE631 Cooperating Autonomous Mobile Robots. Lecture 1: Introduction. Prof. Yi Guo ECE Department
EE631 Cooperating Autonomous Mobile Robots Lecture 1: Introduction Prof. Yi Guo ECE Department Plan Overview of Syllabus Introduction to Robotics Applications of Mobile Robots Ways of Operation Single
More informationChapter 2 Intelligent Control System Architectures
Chapter 2 Intelligent Control System Architectures Making realistic robots is going to polarize the market, if you will. You will have some people who love it and some people who will really be disturbed.
More informationLearning Reactive Neurocontrollers using Simulated Annealing for Mobile Robots
Learning Reactive Neurocontrollers using Simulated Annealing for Mobile Robots Philippe Lucidarme, Alain Liégeois LIRMM, University Montpellier II, France, lucidarm@lirmm.fr Abstract This paper presents
More informationIOSR Journal of Engineering (IOSRJEN) e-issn: , p-issn: , Volume 2, Issue 11 (November 2012), PP 37-43
IOSR Journal of Engineering (IOSRJEN) e-issn: 2250-3021, p-issn: 2278-8719, Volume 2, Issue 11 (November 2012), PP 37-43 Operative Precept of robotic arm expending Haptic Virtual System Arnab Das 1, Swagat
More informationDevelopment of PetRo: A Modular Robot for Pet-Like Applications
Development of PetRo: A Modular Robot for Pet-Like Applications Ben Salem * Polywork Ltd., Sheffield Science Park, Cooper Buildings, Arundel Street, Sheffield, S1 2NS, England ABSTRACT We have designed
More informationActive Agent Oriented Multimodal Interface System
Active Agent Oriented Multimodal Interface System Osamu HASEGAWA; Katsunobu ITOU, Takio KURITA, Satoru HAYAMIZU, Kazuyo TANAKA, Kazuhiko YAMAMOTO, and Nobuyuki OTSU Electrotechnical Laboratory 1-1-4 Umezono,
More informationUser Interface Agents
User Interface Agents Roope Raisamo (rr@cs.uta.fi) Department of Computer Sciences University of Tampere http://www.cs.uta.fi/sat/ User Interface Agents Schiaffino and Amandi [2004]: Interface agents are
More informationRealtime 3D Computer Graphics Virtual Reality
Realtime 3D Computer Graphics Virtual Reality Marc Erich Latoschik AI & VR Lab Artificial Intelligence Group University of Bielefeld Virtual Reality (or VR for short) Virtual Reality (or VR for short)
More informationCognitive Media Processing
Cognitive Media Processing 2013-10-15 Nobuaki Minematsu Title of each lecture Theme-1 Multimedia information and humans Multimedia information and interaction between humans and machines Multimedia information
More informationComputer Haptics and Applications
Computer Haptics and Applications EURON Summer School 2003 Cagatay Basdogan, Ph.D. College of Engineering Koc University, Istanbul, 80910 (http://network.ku.edu.tr/~cbasdogan) Resources: EURON Summer School
More informationDesigning the consumer experience
Designing the consumer experience Rick (H.N.J.) Schifferstein Delft University of Technology Challenge the future Pine & Gilmore (1999) 2 Retail experiences 3 4 What is an experience? 5 Framework of Product
More informationHenry Lin, Department of Electrical and Computer Engineering, California State University, Bakersfield Lecture 8 (Robotics) July 25 th, 2012
Henry Lin, Department of Electrical and Computer Engineering, California State University, Bakersfield Lecture 8 (Robotics) July 25 th, 2012 1 2 Robotic Applications in Smart Homes Control of the physical
More informationMeet Pepper. Because of this, Pepper will truly change the way we live our lives.
PRESS KIT Meet Pepper Pepper is a humanoid robot, engaging, surprising and above all kind. Pepper is the first emotional robot. He was not designed for an industrial function, rather to be a true companion
More informationNeural Models for Multi-Sensor Integration in Robotics
Department of Informatics Intelligent Robotics WS 2016/17 Neural Models for Multi-Sensor Integration in Robotics Josip Josifovski 4josifov@informatik.uni-hamburg.de Outline Multi-sensor Integration: Neurally
More informationCognitive Robotics 2017/2018
Cognitive Robotics 2017/2018 Course Introduction Matteo Matteucci matteo.matteucci@polimi.it Artificial Intelligence and Robotics Lab - Politecnico di Milano About me and my lectures Lectures given by
More informationHumanoid robot. Honda's ASIMO, an example of a humanoid robot
Humanoid robot Honda's ASIMO, an example of a humanoid robot A humanoid robot is a robot with its overall appearance based on that of the human body, allowing interaction with made-for-human tools or environments.
More informationA Study on the control Method of 3-Dimensional Space Application using KINECT System Jong-wook Kang, Dong-jun Seo, and Dong-seok Jung,
IJCSNS International Journal of Computer Science and Network Security, VOL.11 No.9, September 2011 55 A Study on the control Method of 3-Dimensional Space Application using KINECT System Jong-wook Kang,
More informationChapter 6 Experiments
72 Chapter 6 Experiments The chapter reports on a series of simulations experiments showing how behavior and environment influence each other, from local interactions between individuals and other elements
More informationThe Third Generation of Robotics: Ubiquitous Robot
The Third Generation of Robotics: Ubiquitous Robot Jong-Hwan Kim, Yong-Duk Kim, and Kang-Hee Lee Robot Intelligence Laboratory, KAIST, Yuseong-gu, Daejeon 305-701, Republic of Korea {johkim, ydkim, khlee}@rit.kaist.ac.kr
More informationLive Feeling on Movement of an Autonomous Robot Using a Biological Signal
Live Feeling on Movement of an Autonomous Robot Using a Biological Signal Shigeru Sakurazawa, Keisuke Yanagihara, Yasuo Tsukahara, Hitoshi Matsubara Future University-Hakodate, System Information Science,
More informationSession 11 Introduction to Robotics and Programming mbot. >_ {Code4Loop}; Roochir Purani
Session 11 Introduction to Robotics and Programming mbot >_ {Code4Loop}; Roochir Purani RECAP from last 2 sessions 3D Programming with Events and Messages Homework Review /Questions Understanding 3D Programming
More informationCognitive robots and emotional intelligence Cloud robotics Ethical, legal and social issues of robotic Construction robots Human activities in many
Preface The jubilee 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016 was held in the conference centre of the Best Western Hotel M, Belgrade, Serbia, from 30 June to 2 July
More informationA practical experiment with interactive humanoid robots in a human society
A practical experiment with interactive humanoid robots in a human society Takayuki Kanda 1, Takayuki Hirano 1, Daniel Eaton 1, and Hiroshi Ishiguro 1,2 1 ATR Intelligent Robotics Laboratories, 2-2-2 Hikariai
More informationAFFECTIVE COMPUTING FOR HCI
AFFECTIVE COMPUTING FOR HCI Rosalind W. Picard MIT Media Laboratory 1 Introduction Not all computers need to pay attention to emotions, or to have emotional abilities. Some machines are useful as rigid
More informationEvaluation of Five-finger Haptic Communication with Network Delay
Tactile Communication Haptic Communication Network Delay Evaluation of Five-finger Haptic Communication with Network Delay To realize tactile communication, we clarify some issues regarding how delay affects
More informationARMY RDT&E BUDGET ITEM JUSTIFICATION (R2 Exhibit)
Exhibit R-2 0602308A Advanced Concepts and Simulation ARMY RDT&E BUDGET ITEM JUSTIFICATION (R2 Exhibit) FY 2005 FY 2006 FY 2007 FY 2008 FY 2009 FY 2010 FY 2011 Total Program Element (PE) Cost 22710 27416
More informationGraphical Simulation and High-Level Control of Humanoid Robots
In Proc. 2000 IEEE RSJ Int l Conf. on Intelligent Robots and Systems (IROS 2000) Graphical Simulation and High-Level Control of Humanoid Robots James J. Kuffner, Jr. Satoshi Kagami Masayuki Inaba Hirochika
More informationCheekTouch: An Affective Interaction Technique while Speaking on the Mobile Phone
CheekTouch: An Affective Interaction Technique while Speaking on the Mobile Phone Young-Woo Park Department of Industrial Design, KAIST, Daejeon, Korea pyw@kaist.ac.kr Chang-Young Lim Graduate School of
More informationHybrid Neuro-Fuzzy System for Mobile Robot Reactive Navigation
Hybrid Neuro-Fuzzy ystem for Mobile Robot Reactive Navigation Ayman A. AbuBaker Assistance Prof. at Faculty of Information Technology, Applied cience University, Amman- Jordan, a_abubaker@asu.edu.jo. ABTRACT
More informationExtracting Navigation States from a Hand-Drawn Map
Extracting Navigation States from a Hand-Drawn Map Marjorie Skubic, Pascal Matsakis, Benjamin Forrester and George Chronis Dept. of Computer Engineering and Computer Science, University of Missouri-Columbia,
More informationApplication of Virtual Reality Technology in College Students Mental Health Education
Journal of Physics: Conference Series PAPER OPEN ACCESS Application of Virtual Reality Technology in College Students Mental Health Education To cite this article: Ming Yang 2018 J. Phys.: Conf. Ser. 1087
More informationBiomimetic Design of Actuators, Sensors and Robots
Biomimetic Design of Actuators, Sensors and Robots Takashi Maeno, COE Member of autonomous-cooperative robotics group Department of Mechanical Engineering Keio University Abstract Biological life has greatly
More information