Dynamic Emotion-Based Human-Robot Collaborative Assembly in Manufacturing: The Preliminary Concepts


Rahman S. M. Mizanoor, David Adam Spencer, Xiaotian Wang and Yue Wang
Department of Mechanical Engineering, Clemson University, Clemson, SC 29634, USA
E-mails: {rrahman,

Abstract- In this paper, we present preliminary concepts for dynamic emotion-based human-robot collaboration in a manufacturing assembly task. We employ an anthropomorphic robot with emotion display abilities to collaborate with a human on the assembly task. The human serves as both a supervisor and a co-worker when collaborating with the robot. We consider situations with neither direct physical contact nor tele-operated interaction between the robot and the human: they work side by side, sharing the same objective and working environment. We first identify how the human may be influenced by the robot's emotions during the collaboration. We then propose intelligent control algorithms that may help the robot dynamically adjust its emotions to the task situations. We preliminarily investigate the effects of emotions on Human-Robot Interaction (HRI) and on assembly performance. The results show that a static emotion produces better HRI and assembly performance than no emotion. Dynamic robot emotions may improve HRI in manufacturing operations and thus enhance manufacturing performance in terms of productivity, efficiency, quality, safety, cost effectiveness, etc.

Keywords- Dynamic Emotion; Humanoid Manufacturing Robot; Intelligent Control; Human-Robot Collaboration; Intelligent Assembly; Smart Manufacturing

I. INTRODUCTION

Humans show various emotions through various facial expressions.
Common emotions observed in humans include anger, frustration, disgust, fear, distress, calmness, joy, sorrow, surprise, interest, boredom, anxiousness, curiosity, attraction, desire, admiration, sadness, trust, embarrassment, helplessness, powerlessness, worry, doubt, anxiety, shock, stress, etc. Each emotion has its antecedent conditions and affects the behaviors and functions of the human as well as those of nearby humans. Emotions are dynamic phenomena [1]. Emotions foster face-to-face human-human communication and help a human understand another human's mental state, which may then help decide appropriate responses, interactions, and behaviors toward that human. Some emotions may have positive effects and enhance the effectiveness of communications and interactions, while other emotions may affect interactions negatively [1]. Social robotics researchers were inspired by human emotions to develop robots with anthropomorphic appearance and human-like emotions. It was believed that such anthropomorphic appearance and human-like emotional features would make human-robot interactions and collaborations more effective and realistic [2]-[3]. The MIT robot Kismet, the Japanese robot Kobian, the British robot ERWIN, NAO, iCat, Flobi, etc. are early-stage anthropomorphic robots that could exhibit emotions [4]-[6]. Most recently, several anthropomorphic robots with nearly indistinguishable human appearance have been introduced that can exhibit more realistic and human-like emotions. These robots include the Hanson RoboKind robots with realistic human faces, the Geminoid robots, Nadine, etc. [7]-[8]. However, all of the above state-of-the-art humanoid robots with emotions have been proposed for human-robot social interactions [9]-[10]. So far, there are almost no practical applications of such emotional robots; their applications are limited to laboratory experiments only.
Furthermore, these emotions are mostly static: they are not dynamically adjustable to those of their human counterparts or to the task situations [9]-[10], which hinders symbiosis in human-robot collaboration [11]. It is true that some robots can recognize a human's emotions and change their own emotions accordingly [4]. But these robots are for social services, their dynamic ability is still very limited, and their emotions are not associated with their functions. On the other hand, most robotic devices that collaborate with humans in manufacturing environments have functional configurations and appearances that lack anthropomorphism and emotion [12]-[13]. We believe that humanoid robots with properly designed dynamic emotional abilities may enhance human-robot interaction (HRI) in manufacturing environments and thus improve manufacturing performance in terms such as productivity, efficiency, costs, quality, etc., because a dynamic emotion capability may let the robot display its mental state based on the task situation, making communication between the robot and the human more effective. However, such anthropomorphic robots with dynamic emotional abilities have never been used in manufacturing environments, and the effects of the dynamic emotions of robots on manufacturing performance have never been investigated.

Hence, we propose an anthropomorphic robot with emotional abilities for collaboration with a human in a manufacturing assembly task. We identify a specific task where the human and the robot collaborate to assemble parts/objects. The human acts as both a supervisor and a co-worker. We propose an algorithm that may make the robot's emotions dynamic based on the task situations. We then explain how the robot's emotions help the human better adjust to the robot. We also propose experiment methods that may help reveal the effects of the robot's dynamic emotions on HRI and on manufacturing performance. As a proof of concept, we present preliminary results comparing HRI between the no-emotion and static-emotion conditions of the robot for the human-robot cooperative assembly task.

II. THE MANUFACTURING ROBOT

We use a Baxter humanoid manufacturing research robot made by Rethink Robotics [14], as shown in Fig. 1; it is the first collaborative anthropomorphic robot with emotions used in manufacturing activities. Both of its hands can work simultaneously (multitasking arms). Each arm has 7 Degrees-of-Freedom (DoFs). The joints are compliant, with back-drivable motors and force sensors. Through force detection, the robot can feel contact with objects and work surfaces.

Figure 1. The Rethink Baxter robot (loaned from BMW) to be used for the proposed human-robot collaborative assembly in manufacturing.

The left-hand end-effectors (grippers) are changeable and suitable for pick-and-place (PnP) operations. The right hand has a pneumatic end-effector (gripper) that can pick and place very thin papers and plates. Baxter is suitable for material handling, machine tending, testing and sorting, light assembly, finishing operations, etc. It has a movable base and a rotary screen at its head where emotions can be displayed.
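As a hedged illustration of how the head screen could select what to display, the situation-to-emotion pairing described in Section IV can be modeled as a simple lookup. This is a minimal sketch: the situation labels and the default value are our assumptions, not part of the Baxter software.

```python
# Sketch: map task situations to the emotion shown on Baxter's head screen.
# Situation names are hypothetical labels for the cases listed in Section IV.
EMOTION_FOR_SITUATION = {
    "ready_for_training": "neutral",        # robot awaits teaching by demonstration
    "working_with_human": "concentrating",  # robot focuses on the joint task
    "needs_more_input":   "confused",       # task inputs are insufficient
    "unexpected_entry":   "surprised",      # e.g., someone enters the work area
    "on_hold":            "sad",            # robot awaits further instructions
    "standby":            "asleep",         # robot is idle
    "all_satisfactory":   "happy",          # everything proceeds as expected
    "helpless":           "angry",          # no way to proceed with the work
}

def emotion_to_display(situation: str) -> str:
    """Return the screen emotion for a situation, defaulting to neutral."""
    return EMOTION_FOR_SITUATION.get(situation, "neutral")
```

A table-driven mapping like this keeps the display policy separate from the task logic, so the pairing can be revised without touching the control code.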
The robot's emotions may be dynamically changed on the screen based on task situations when the robot works with human co-workers in manufacturing environments. The base color of the screen can also be adjusted to reflect changes in its emotions. The human co-worker can teach the robot by demonstration, i.e., by moving the robot's arms and joints to show how to perform a manufacturing task. The robot is controlled through the Robot Operating System (ROS) software [15].

III. THE SELECTED HUMAN-ROBOT COLLABORATIVE ASSEMBLY TASK

Figure 2. The selected human-robot collaborative assembly task.

We select a representative assembly task to be performed by the Baxter robot and the human co-worker in collaboration. As shown in Fig. 2, the robot first picks a rectangular object (first object) and places it on a table surface. Then the human picks another object (second object) and places it on the first object. Then the robot again picks another object (third object) and places it on the second object. In this task, the human has roles as both a co-worker and a supervisor. The human co-works with the robot on the assembly task as described above. In addition, the human has supervisory roles such as:
1. The human teaches the robot how to pick and place the objects.
2. The human ensures that the correct objects (correct size, shape, etc.) are input to the robot in a timely manner and placed at the right location so that the robot can pick them correctly.
3. The human dispatches the objects placed by the robot in a timely manner so that the locations where the robot needs to place objects are always empty.
4. The human ensures that no unwanted persons or obstacles enter the workspace.
5. The human ensures the robot's power supply.
6. The human fixes jigs or fixtures if necessary and does other auxiliary work related to the assembly.
7. The human changes the control commands through software inputs, keyboard control, or joystick control if necessary.
8. When the task is finished, the human either turns the robot off or transfers the robot to another task and teaches it the new task.
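The three-step alternating sequence above can be sketched as a short turn-taking script. This is a minimal sketch under our own naming (the actor labels and the stack representation are hypothetical, not from the paper's implementation):

```python
# Sketch: the alternating pick-and-place turns for the three-object assembly.
# Each tuple is (actor, object_index): the robot handles objects 1 and 3,
# the human handles object 2, per the task in Fig. 2.
ASSEMBLY_TURNS = [("robot", 1), ("human", 2), ("robot", 3)]

def run_assembly():
    """Simulate the stacking order and return the resulting stack (bottom first)."""
    stack = []
    for actor, obj in ASSEMBLY_TURNS:
        # In the real task the robot's turns replay the taught PnP routine,
        # while the human's turn is a manual placement.
        stack.append((actor, obj))
    return stack
```

The point of the sketch is the strict alternation: each placement depends on the previous actor having finished, which is what makes the collaboration (and the emotion-based signaling around it) meaningful.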

Thus, the human shares the assembly task and the manufacturing environment with the robot in a twofold role (supervisor and co-worker). We select this assembly task because:
1. This type of assembly task is very common in most manufacturing environments, e.g., automobile assembly, so we believe our model may be useful to many manufacturing companies.
2. Human-robot collaboration is very important and very clear for this type of assembly task. We believe that performance under human-robot collaboration may be better than that of the human or the robot alone, because the fully manual task burdens the human and the fully autonomous task is not flexible. Human-robot collaboration reduces the human's burden and at the same time increases flexibility [21]-[23].
3. We think that the emotions of the robot and the human may play significant roles in such assembly tasks.

IV. EMOTION-BASED INTERACTIONS BETWEEN THE ROBOT AND THE HUMAN DURING THE TASK

There are face-to-face communications when the human plays his/her supervisory and co-worker roles with the robot during the assembly task, as explained in Section III. The face-to-face communications for the above task may produce three types of emotion-based interactions between the human and the robot:
1. The robot's emotions are dynamically changed based on the task situations; the human can recognize the robot's emotions and is influenced by them.
2. The human's emotions are dynamically changed based on the task situations; the robot can recognize the human's emotions and is influenced by them.
3. The human and the robot both recognize the dynamically changed emotions of each other and are influenced by each other (in this case, natural emotion-based bi-directional communication occurs between the human and the robot).
This paper is confined to case #1 above, where the robot's emotions are dynamically changed based on the task situations and the human recognizes and is influenced by the robot's emotions. In this case, the robot evaluates the human's cooperation level and displays its emotional reactions on its head screen. The emotional displays may change with the task situations and with the human's cooperation level. The human may be able to recognize the robot's emotions; understand and analyze the task situations, the robot's status, and its understanding of the task; and adjust his/her cooperation level to provide the best cooperation to the robot counterpart.

Figure 3. Prospective emotions that the robot may display at various task situations and cooperation levels [14].

Different types of emotions may be displayed on the robot's screen based on the task situations and the human's cooperation level. The robot's neutral emotional state may be displayed when the robot is ready to receive training from the human. The robot may show a concentrating emotion when it concentrates on the work with the human. The robot may show a confused emotion when it needs more inputs on the task. The robot may show a surprised emotion when it is surprised, e.g., when someone enters the work area unexpectedly. The robot may show a sad emotion when it is on hold and awaits further instructions. The robot may show an asleep face when it is on standby. The robot may show a happy emotion when everything seems satisfactory to the robot. The robot may show an angry emotion when it becomes helpless and sees no hope of proceeding with the work. Figure 3 shows the prospective emotions that the robot may display at various task situations and cooperation levels.

V. INTELLIGENT CONTROL ALGORITHMS FOR THE DYNAMIC EMOTIONS

Figure 4 shows the proposed intelligent control algorithms for dynamic emotions for the human-robot collaborative assembly. The algorithms associate dynamic emotions with the task described in Fig. 2. At first, the robot shows the neutral emotion on its screen, which means that the robot is ready to receive training for the task. Then the robot moves from the initial position to the location where it needs to pick the object (the InitToPick function plays back what was previously taught to the robot by the human). The robot end-effector includes an infrared rangefinder that is used to locate the edges of the object and determine the height for picking the object from the table surface. We assume that the correct object height (h) is x with a tolerance of y. If the object (input by the human) is found within the correct height, the robot expresses happiness, then shows the concentrating emotion, and then picks the object and places it at the specified location on the table (the PnP function plays back what was

previously taught to the robot by the human). Then the human places the second object on the first object that was picked and placed by the robot. However, if the object's height is not correct, the robot becomes confused, and later sad. If the human understands the problem and swiftly replaces the wrong object with the correct one, the robot becomes happy and concentrates on the work. The robot then picks the object and places it at the designated location on the table, and the human puts the second object on the first object placed by the robot. However, if the correct object is still not found, the robot becomes angry. The robot picks the third object and places it on the second object in a similar way.

Figure 4. The intelligent control algorithm for dynamic emotions for the human-robot collaborative assembly.

As the next step, we will determine the dynamics of the robot's emotions inspired by those of the human [16]-[20]. We will quantify the dynamic evolution of robot emotions through analytical mathematical equations. We will then design suitable intelligent control algorithms based on the emotion dynamics to dynamically adjust the robot's emotions to those of the human and to the task situations.

VI. EXPERIMENTS

We need to evaluate the effectiveness of the aforementioned control algorithms in producing dynamic emotions on the robot's head screen that are adjusted dynamically to the task situations. We also need to evaluate how the dynamic emotions of the robot make the human-robot interactions more effective, which may be reflected in enhanced assembly performance in terms of efficiency, productivity, costs, quality, safety, etc.
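The height-check branch of the Fig. 4 algorithm being evaluated here can be sketched as follows. This is a minimal sketch under the paper's placeholders: the expected height x and tolerance y are passed as parameters, and `replaced_correctly` is our hypothetical flag for whether the human swiftly swapped in the correct object.

```python
def pick_emotion_sequence(measured_h, expected_h, tol, replaced_correctly=False):
    """Emotion sequence for one pick attempt, following the Fig. 4 branches.

    measured_h: height reported by the infrared rangefinder
    expected_h: correct object height (the paper's x)
    tol:        allowed tolerance (the paper's y)
    """
    if abs(measured_h - expected_h) <= tol:
        # Correct object: happy, then concentrating while the PnP routine runs.
        return ["happy", "concentrating"]
    # Wrong object: confused first, then sad while waiting for the human.
    sequence = ["confused", "sad"]
    if replaced_correctly:
        # Human swapped in the correct object: resume the happy path and PnP.
        sequence += ["happy", "concentrating"]
    else:
        # Correct object still not found: the robot becomes angry.
        sequence.append("angry")
    return sequence
```

Each displayed emotion thus carries task information the human supervisor can act on: "confused"/"sad" signals a wrong object, while "happy" then "concentrating" confirms that the pick can proceed.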
To understand the effects of the dynamic emotions of the robot on manufacturing performance, we first need to determine an objective evaluation scheme for each manufacturing performance criterion (efficiency, productivity, costs, quality, and safety) for the assembly task. Then we need to evaluate the manufacturing performance separately under the following three conditions: 1. Assembly task done in collaboration between the robot and the human with no emotion of the robot (in this condition, the emotion display screen of the robot will be either turned off or removed from the robot),

2. Assembly task done in collaboration between the robot and the human with a static emotion of the robot (in this condition, the emotion display screen will be turned on and the robot will be operated under its current static emotion abilities, e.g., only the concentrating emotion will be displayed when the robot concentrates on the work), 3. Assembly task done in collaboration between the robot and the human with dynamic emotions of the robot (in this condition, the emotion display screen will be turned on and the robot will be operated with its dynamic emotion abilities based on the novel control algorithms depicted in Fig. 4).

If we compare performance among the aforementioned three conditions, we may be able to understand the effects of dynamic emotions on assembly performance. However, as an initial effort, we have conducted preliminary experiments for conditions #1 and #2 separately. For condition #1, eleven (11) subjects separately collaborated with the robot on the assembly task introduced in Fig. 2, but the robot displayed no emotion. The experimenter subjectively evaluated the HRI for each subject separately based on the following two criteria, using a rating scale between 1 (lowest) and 5 (highest):
1. Awareness: how aware the human was of the task situations (situational awareness);
2. Engagement: how engaged the human was with the robot during the task.
The experimenter also subjectively evaluated the assembly performance of each subject using the same rating scale based on the following criterion:
1. Accuracy: how accurately (positional or placement accuracy) the human and the robot collaboratively assembled the objects.
Similar procedures were followed for condition #2, but the robot had a single static emotion (concentrating) displayed on the screen while working with the human. Figure 5 shows the experiment procedures for conditions #1 and #2.

Figure 5. (a) The human and robot collaboratively assemble three objects where the robot has no emotion; (b) the human and robot collaboratively assemble three objects where the robot has a static emotion (concentrating emotion).

VII. EXPERIMENT RESULTS

We determined the mean (n = 11) values of the evaluation scores for each evaluation criterion separately for the no-emotion and static-emotion conditions, as shown in Fig. 6. The results show that the static emotion produces better HRI than the no-emotion condition. The better HRI also produces better assembly performance for the static-emotion condition. However, the reliability of the results will need to be increased by increasing the number of subjects, improving the experiment setup and procedures, improving the evaluation criteria and metrics, etc. On the other hand, the static emotion may prove worse than no emotion if a wrong static emotion is displayed or if the static emotion does not fit the dynamic task situations.

Figure 6. Mean (n = 11) evaluation results with standard deviations for HRI (Engagement, Awareness) and assembly performance (Accuracy) for the no-emotion and static-emotion conditions for the human-robot collaborative assembly task.

VIII. CONCLUSIONS AND FUTURE WORKS

We presented the collaborative manufacturing robot Rethink Baxter and an assembly task in manufacturing in which the robot and the human collaborate. We identified the situations where the emotions of the robot may affect the HRI and the manufacturing performance. We then proposed intelligent control algorithms to adjust the robot's emotions dynamically to the task situations. As a proof of concept, we also preliminarily investigated the effects of emotions on HRI and assembly performance through a no-emotion versus static-emotion experiment for the assembly task. The results

preliminarily show that the static emotion produces better HRI and assembly performance (on average) than the no-emotion condition. In the near future, we will reinvestigate the effects of emotions on HRI and assembly performance through the no-emotion versus static-emotion experiment using more subjects, better hypotheses, and improved experiment methods and metrics. We will report the effectiveness of the control algorithms in producing the dynamic emotions, as well as the effects of the dynamic emotions on HRI and on assembly performance in manufacturing. We will determine the dynamics of the robot's emotions inspired by those of the human and will quantify the dynamic evolution of robot emotions through analytical mathematical equations. We will then further enrich the control algorithms based on the emotion dynamics and adjust the robot's emotions to those of the human and to the task situations.

ACKNOWLEDGEMENT

The authors express thanks and gratitude to BMW Manufacturing Co., Spartanburg, SC, USA for support with the Baxter robot used for the research presented in this paper. The authors are also thankful to all the subjects who participated in the experiments.

REFERENCES

[1] D. Robinson, Brain function, mental experience and personality, The Netherlands Journal of Psychology, Vol. 64.
[2] C. Breazeal, Emotion and sociable humanoid robots, International Journal of Human-Computer Studies (Applications of Affective Computing in Human-Computer Interaction), Vol. 59, No. 1-2, 2003.
[3] K. Malchus, P. Jaecks, O. Damm, P. Stenneken, C. Meyer, B. Wrede, The role of emotional congruence in human-robot interaction, in Proc. of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 3-6 March.
[4] C. Breazeal, Sociable machines: expressive social exchange between humans and robots, Sc.D. Dissertation, Department of Electrical Engineering and Computer Science, MIT.
[5] A. Breemen, X. Yan, and B.
Meerbeek, iCat: an animated user-interface robot with personality, in Proc. of the Fourth International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS '05), ACM, NY, USA.
[6] S. Wachsmuth, S. Schulz, F. Lier, F. Siepmann, and I. Lutkebohle, The robot head Flobi: a research platform for cognitive interaction technology, in S. Wölfl (Ed.), Poster and Demo Track of the 35th German Conference on Artificial Intelligence (KI-2012), pp. 3-7.
[7] S. Rahman, Generating human-like social motion in a human-looking humanoid robot: the biomimetic approach, in Proc. of 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO), Dec.
[8] S. Nishio, H. Ishiguro, N. Hagita, Geminoid: teleoperated android of an existing person, chapter in Humanoid Robots: New Developments, I-Tech Education and Publishing, Vienna, Austria, June.
[9] F. Eyssel, D. Kuchenbrandt, F. Hegel, and L. Ruiter, Activating elicited agent knowledge: how robot and user features shape the perception of social robots, in Proc. of the IEEE Int. Symp. on Robot and Human Interactive Communication.
[10] F. Hegel, S. Krach, T. Kircher, B. Wrede, Understanding social robots: a user study on anthropomorphism, in Proc. of 2008 IEEE Int. Symp. on Robot and Human Interactive Communication.
[11] K. Kawamura, T. Rogers, K. Hambuchen, D. Erol, Towards a human robot symbiotic system, Robotics and Computer Integrated Manufacturing, Vol. 19.
[12] C. Heyer, Human-robot interaction and future industrial robotics applications, in Proc. of 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct.
[13] S. Nikolaidis, P. Lasota, G. Rossano, C. Martinez, T. Fuhlbrigge, and J. Shah, Human-robot collaboration in manufacturing: quantitative evaluation of predictable, convergent joint action, in Proc. of the 44th International Symposium on Robotics.
[14]
[15]
[16] M. Nasr, Modeling emotion dynamics in intelligent agents, Master's Thesis, Texas A&M University, USA.
[17] J. Steephen, HED: a computational model of affective adaptation and emotion dynamics, IEEE Transactions on Affective Computing, Vol. 4, No. 2, April-June.
[18] E. Schmidt and Y. Kim, Modeling musical emotion dynamics with conditional random fields, in Proc. of the International Society for Music Information Retrieval Conference (ISMIR 2011).
[19] M. Belhaj, F. Kebair, and L. Said, A computational model of emotions for the simulation of human emotional dynamics in emergency situations, International Journal of Computer Theory and Engineering, Vol. 6, No. 3.
[20] D. Kim, P. Baranyi, Novel emotion dynamic express for robot, in Proc. of 2011 IEEE 9th International Symposium on Applied Machine Intelligence and Informatics (SAMI), Jan.
[21] S. M. M. Rahman, R. Ikeura, Weight-perception-based novel control of a power-assist robot for the cooperative lifting of light-weight objects, International Journal of Advanced Robotic Systems, Vol. 9, No. 118, 13 pages, Oct.
[22] S. M. M. Rahman, R. Ikeura, Improving interactions between a power assist robot system and its human user in horizontal transfer of objects using a novel adaptive control method, Advances in Human-Computer Interaction, Vol. 2012, 12 pages, December.
[23] S. M. M. Rahman, R. Ikeura, M. Nobe, S. Hayakawa, H. Sawai, Weight-perception-based model of power assist system for lifting objects, International Journal of Automation Technology (Special Issue on Robotic Technology to Extend Workers' Physical Abilities and Skills), Vol. 3, No. 6, Nov.


Design and Control of the BUAA Four-Fingered Hand

Design and Control of the BUAA Four-Fingered Hand Proceedings of the 2001 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 2001 Design and Control of the BUAA Four-Fingered Hand Y. Zhang, Z. Han, H. Zhang, X. Shang, T. Wang,

More information

Associated Emotion and its Expression in an Entertainment Robot QRIO

Associated Emotion and its Expression in an Entertainment Robot QRIO Associated Emotion and its Expression in an Entertainment Robot QRIO Fumihide Tanaka 1. Kuniaki Noda 1. Tsutomu Sawada 2. Masahiro Fujita 1.2. 1. Life Dynamics Laboratory Preparatory Office, Sony Corporation,

More information

Learning Actions from Demonstration

Learning Actions from Demonstration Learning Actions from Demonstration Michael Tirtowidjojo, Matthew Frierson, Benjamin Singer, Palak Hirpara October 2, 2016 Abstract The goal of our project is twofold. First, we will design a controller

More information

A Practical Approach to Understanding Robot Consciousness

A Practical Approach to Understanding Robot Consciousness A Practical Approach to Understanding Robot Consciousness Kristin E. Schaefer 1, Troy Kelley 1, Sean McGhee 1, & Lyle Long 2 1 US Army Research Laboratory 2 The Pennsylvania State University Designing

More information

Emotion Sensitive Active Surfaces

Emotion Sensitive Active Surfaces Emotion Sensitive Active Surfaces Larissa Müller 1, Arne Bernin 1,4, Svenja Keune 2, and Florian Vogt 1,3 1 Department Informatik, University of Applied Sciences (HAW) Hamburg, Germany 2 Department Design,

More information

Evaluation of an Enhanced Human-Robot Interface

Evaluation of an Enhanced Human-Robot Interface Evaluation of an Enhanced Human-Robot Carlotta A. Johnson Julie A. Adams Kazuhiko Kawamura Center for Intelligent Systems Center for Intelligent Systems Center for Intelligent Systems Vanderbilt University

More information

STRATEGO EXPERT SYSTEM SHELL

STRATEGO EXPERT SYSTEM SHELL STRATEGO EXPERT SYSTEM SHELL Casper Treijtel and Leon Rothkrantz Faculty of Information Technology and Systems Delft University of Technology Mekelweg 4 2628 CD Delft University of Technology E-mail: L.J.M.Rothkrantz@cs.tudelft.nl

More information

Intelligent interaction

Intelligent interaction BionicWorkplace: autonomously learning workstation for human-machine collaboration Intelligent interaction Face to face, hand in hand. The BionicWorkplace shows the extent to which human-machine collaboration

More information

Real-time Adaptive Robot Motion Planning in Unknown and Unpredictable Environments

Real-time Adaptive Robot Motion Planning in Unknown and Unpredictable Environments Real-time Adaptive Robot Motion Planning in Unknown and Unpredictable Environments IMI Lab, Dept. of Computer Science University of North Carolina Charlotte Outline Problem and Context Basic RAMP Framework

More information

Cognitive Robotics 2017/2018

Cognitive Robotics 2017/2018 Cognitive Robotics 2017/2018 Course Introduction Matteo Matteucci matteo.matteucci@polimi.it Artificial Intelligence and Robotics Lab - Politecnico di Milano About me and my lectures Lectures given by

More information

Available online at ScienceDirect. Procedia Computer Science 76 (2015 )

Available online at   ScienceDirect. Procedia Computer Science 76 (2015 ) Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 76 (2015 ) 474 479 2015 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS 2015) Sensor Based Mobile

More information

An Experimental Comparison of Path Planning Techniques for Teams of Mobile Robots

An Experimental Comparison of Path Planning Techniques for Teams of Mobile Robots An Experimental Comparison of Path Planning Techniques for Teams of Mobile Robots Maren Bennewitz Wolfram Burgard Department of Computer Science, University of Freiburg, 7911 Freiburg, Germany maren,burgard

More information

Evaluating Facial Expression Synthesis on Robots

Evaluating Facial Expression Synthesis on Robots (2013). "Evaluating Facial Expression Synthesis on Robots". In Proceedings of the HRI Workshop on Applications for Emotional Robots at the 8th ACM International Conference on Human-Robot Interaction (HRI).

More information

Human-Swarm Interaction

Human-Swarm Interaction Human-Swarm Interaction a brief primer Andreas Kolling irobot Corp. Pasadena, CA Swarm Properties - simple and distributed - from the operator s perspective - distributed algorithms and information processing

More information

Adaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers

Adaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers Proceedings of the 3 rd International Conference on Mechanical Engineering and Mechatronics Prague, Czech Republic, August 14-15, 2014 Paper No. 170 Adaptive Humanoid Robot Arm Motion Generation by Evolved

More information

Human-Robot Companionships. Mark Neerincx

Human-Robot Companionships. Mark Neerincx Human-Robot Companionships Mark Neerincx TNO and DUT Perceptual and Cognitive Systems Interactive Intelligence International User-Centred Robot R&D Delft Robotics Institute What is a robot? The word robot

More information

Wireless Robust Robots for Application in Hostile Agricultural. environment.

Wireless Robust Robots for Application in Hostile Agricultural. environment. Wireless Robust Robots for Application in Hostile Agricultural Environment A.R. Hirakawa, A.M. Saraiva, C.E. Cugnasca Agricultural Automation Laboratory, Computer Engineering Department Polytechnic School,

More information

CURRICULUM VITAE. Evan Drumwright EDUCATION PROFESSIONAL PUBLICATIONS

CURRICULUM VITAE. Evan Drumwright EDUCATION PROFESSIONAL PUBLICATIONS CURRICULUM VITAE Evan Drumwright 209 Dunn Hall The University of Memphis Memphis, TN 38152 Phone: 901-678-3142 edrmwrgh@memphis.edu http://cs.memphis.edu/ edrmwrgh EDUCATION Ph.D., Computer Science, May

More information

- Basics of informatics - Computer network - Software engineering - Intelligent media processing - Human interface. Professor. Professor.

- Basics of informatics - Computer network - Software engineering - Intelligent media processing - Human interface. Professor. Professor. - Basics of informatics - Computer network - Software engineering - Intelligent media processing - Human interface Computer-Aided Engineering Research of power/signal integrity analysis and EMC design

More information

Dipartimento di Elettronica Informazione e Bioingegneria Robotics

Dipartimento di Elettronica Informazione e Bioingegneria Robotics Dipartimento di Elettronica Informazione e Bioingegneria Robotics Behavioral robotics @ 2014 Behaviorism behave is what organisms do Behaviorism is built on this assumption, and its goal is to promote

More information

Children s age influences their perceptions of a humanoid robot as being like a person or machine.

Children s age influences their perceptions of a humanoid robot as being like a person or machine. Children s age influences their perceptions of a humanoid robot as being like a person or machine. Cameron, D., Fernando, S., Millings, A., Moore. R., Sharkey, A., & Prescott, T. Sheffield Robotics, The

More information

Evolutionary robotics Jørgen Nordmoen

Evolutionary robotics Jørgen Nordmoen INF3480 Evolutionary robotics Jørgen Nordmoen Slides: Kyrre Glette Today: Evolutionary robotics Why evolutionary robotics Basics of evolutionary optimization INF3490 will discuss algorithms in detail Illustrating

More information

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space Proceedings of the 21 IEEE International Conference on Robotics & Automation Seoul, Korea May 2126, 21 Robot based on the Equations of Emotion defined in the 3D Mental Space Hiroyasu Miwa *, Tomohiko Umetsu

More information

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with

More information

Physical Human Robot Interaction

Physical Human Robot Interaction MIN Faculty Department of Informatics Physical Human Robot Interaction Intelligent Robotics Seminar Ilay Köksal University of Hamburg Faculty of Mathematics, Informatics and Natural Sciences Department

More information

NAVIGATION OF MOBILE ROBOT USING THE PSO PARTICLE SWARM OPTIMIZATION

NAVIGATION OF MOBILE ROBOT USING THE PSO PARTICLE SWARM OPTIMIZATION Journal of Academic and Applied Studies (JAAS) Vol. 2(1) Jan 2012, pp. 32-38 Available online @ www.academians.org ISSN1925-931X NAVIGATION OF MOBILE ROBOT USING THE PSO PARTICLE SWARM OPTIMIZATION Sedigheh

More information

Issues in Information Systems Volume 13, Issue 2, pp , 2012

Issues in Information Systems Volume 13, Issue 2, pp , 2012 131 A STUDY ON SMART CURRICULUM UTILIZING INTELLIGENT ROBOT SIMULATION SeonYong Hong, Korea Advanced Institute of Science and Technology, gosyhong@kaist.ac.kr YongHyun Hwang, University of California Irvine,

More information

Birth of An Intelligent Humanoid Robot in Singapore

Birth of An Intelligent Humanoid Robot in Singapore Birth of An Intelligent Humanoid Robot in Singapore Ming Xie Nanyang Technological University Singapore 639798 Email: mmxie@ntu.edu.sg Abstract. Since 1996, we have embarked into the journey of developing

More information

Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level

Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level Klaus Buchegger 1, George Todoran 1, and Markus Bader 1 Vienna University of Technology, Karlsplatz 13, Vienna 1040,

More information

Prospective Teleautonomy For EOD Operations

Prospective Teleautonomy For EOD Operations Perception and task guidance Perceived world model & intent Prospective Teleautonomy For EOD Operations Prof. Seth Teller Electrical Engineering and Computer Science Department Computer Science and Artificial

More information

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping

Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino

More information

On Application of Virtual Fixtures as an Aid for Telemanipulation and Training

On Application of Virtual Fixtures as an Aid for Telemanipulation and Training On Application of Virtual Fixtures as an Aid for Telemanipulation and Training Shahram Payandeh and Zoran Stanisic Experimental Robotics Laboratory (ERL) School of Engineering Science Simon Fraser University

More information

Android (Child android)

Android (Child android) Social and ethical issue Why have I developed the android? Hiroshi ISHIGURO Department of Adaptive Machine Systems, Osaka University ATR Intelligent Robotics and Communications Laboratories JST ERATO Asada

More information

TREE CLIMBING ROBOT (TREEBOT)

TREE CLIMBING ROBOT (TREEBOT) 9 JEST-M, Vol 4, Issue 4, Jan-2015 TREE CLIMBING ROBOT (TREEBOT) Electronics and Communication department, MVJ College of Engineering srivatsa12ster@gmail.com, vinoop.u@gmail.com, satish.mvjce@gmail.com,

More information

Using Computational Cognitive Models to Build Better Human-Robot Interaction. Cognitively enhanced intelligent systems

Using Computational Cognitive Models to Build Better Human-Robot Interaction. Cognitively enhanced intelligent systems Using Computational Cognitive Models to Build Better Human-Robot Interaction Alan C. Schultz Naval Research Laboratory Washington, DC Introduction We propose an approach for creating more cognitively capable

More information

Stabilize humanoid robot teleoperated by a RGB-D sensor

Stabilize humanoid robot teleoperated by a RGB-D sensor Stabilize humanoid robot teleoperated by a RGB-D sensor Andrea Bisson, Andrea Busatto, Stefano Michieletto, and Emanuele Menegatti Intelligent Autonomous Systems Lab (IAS-Lab) Department of Information

More information

CS 378: Autonomous Intelligent Robotics. Instructor: Jivko Sinapov

CS 378: Autonomous Intelligent Robotics. Instructor: Jivko Sinapov CS 378: Autonomous Intelligent Robotics Instructor: Jivko Sinapov http://www.cs.utexas.edu/~jsinapov/teaching/cs378/ Announcements Robotics Study Still going on... Readings for this week Stoytchev, Alexander.

More information

Robotics Introduction Matteo Matteucci

Robotics Introduction Matteo Matteucci Robotics Introduction About me and my lectures 2 Lectures given by Matteo Matteucci +39 02 2399 3470 matteo.matteucci@polimi.it http://www.deib.polimi.it/ Research Topics Robotics and Autonomous Systems

More information

Effects of Integrated Intent Recognition and Communication on Human-Robot Collaboration

Effects of Integrated Intent Recognition and Communication on Human-Robot Collaboration Effects of Integrated Intent Recognition and Communication on Human-Robot Collaboration Mai Lee Chang 1, Reymundo A. Gutierrez 2, Priyanka Khante 1, Elaine Schaertl Short 1, Andrea Lockerd Thomaz 1 Abstract

More information

Flexible Cooperation between Human and Robot by interpreting Human Intention from Gaze Information

Flexible Cooperation between Human and Robot by interpreting Human Intention from Gaze Information Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems September 28 - October 2, 2004, Sendai, Japan Flexible Cooperation between Human and Robot by interpreting Human

More information

ISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1

ISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1 Development of Multi-D.O.F. Master-Slave Arm with Bilateral Impedance Control for Telexistence Riichiro Tadakuma, Kiyohiro Sogen, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi 7-3-1 Hongo, Bunkyo-ku,

More information

PICK AND PLACE HUMANOID ROBOT USING RASPBERRY PI AND ARDUINO FOR INDUSTRIAL APPLICATIONS

PICK AND PLACE HUMANOID ROBOT USING RASPBERRY PI AND ARDUINO FOR INDUSTRIAL APPLICATIONS PICK AND PLACE HUMANOID ROBOT USING RASPBERRY PI AND ARDUINO FOR INDUSTRIAL APPLICATIONS Bernard Franklin 1, Sachin.P 2, Jagadish.S 3, Shaista Noor 4, Rajashekhar C. Biradar 5 1,2,3,4,5 School of Electronics

More information

Understanding the Mechanism of Sonzai-Kan

Understanding the Mechanism of Sonzai-Kan Understanding the Mechanism of Sonzai-Kan ATR Intelligent Robotics and Communication Laboratories Where does the Sonzai-Kan, the feeling of one's presence, such as the atmosphere, the authority, come from?

More information

Evaluating Fluency in Human-Robot Collaboration

Evaluating Fluency in Human-Robot Collaboration Evaluating Fluency in Human-Robot Collaboration Guy Hoffman Media Innovation Lab, IDC Herzliya P.O. Box 167, Herzliya 46150, Israel Email: hoffman@idc.ac.il Abstract Collaborative fluency is the coordinated

More information

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment-

The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- The Tele-operation of the Humanoid Robot -Whole Body Operation for Humanoid Robots in Contact with Environment- Hitoshi Hasunuma, Kensuke Harada, and Hirohisa Hirukawa System Technology Development Center,

More information

COMP150 Behavior-Based Robotics

COMP150 Behavior-Based Robotics For class use only, do not distribute COMP150 Behavior-Based Robotics http://www.cs.tufts.edu/comp/150bbr/timetable.html http://www.cs.tufts.edu/comp/150bbr/syllabus.html Course Essentials This is not

More information

Proceedings of th IEEE-RAS International Conference on Humanoid Robots ! # Adaptive Systems Research Group, School of Computer Science

Proceedings of th IEEE-RAS International Conference on Humanoid Robots ! # Adaptive Systems Research Group, School of Computer Science Proceedings of 2005 5th IEEE-RAS International Conference on Humanoid Robots! # Adaptive Systems Research Group, School of Computer Science Abstract - A relatively unexplored question for human-robot social

More information

Live Feeling on Movement of an Autonomous Robot Using a Biological Signal

Live Feeling on Movement of an Autonomous Robot Using a Biological Signal Live Feeling on Movement of an Autonomous Robot Using a Biological Signal Shigeru Sakurazawa, Keisuke Yanagihara, Yasuo Tsukahara, Hitoshi Matsubara Future University-Hakodate, System Information Science,

More information

Robot: Geminoid F This android robot looks just like a woman

Robot: Geminoid F This android robot looks just like a woman ProfileArticle Robot: Geminoid F This android robot looks just like a woman For the complete profile with media resources, visit: http://education.nationalgeographic.org/news/robot-geminoid-f/ Program

More information

The Use of Social Robot Ono in Robot Assisted Therapy

The Use of Social Robot Ono in Robot Assisted Therapy The Use of Social Robot Ono in Robot Assisted Therapy Cesar Vandevelde 1, Jelle Saldien 1, Maria-Cristina Ciocci 1, Bram Vanderborght 2 1 Ghent University, Dept. of Industrial Systems and Product Design,

More information

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors In: M.H. Hamza (ed.), Proceedings of the 21st IASTED Conference on Applied Informatics, pp. 1278-128. Held February, 1-1, 2, Insbruck, Austria Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

More information

Multi-Agent Planning

Multi-Agent Planning 25 PRICAI 2000 Workshop on Teams with Adjustable Autonomy PRICAI 2000 Workshop on Teams with Adjustable Autonomy Position Paper Designing an architecture for adjustably autonomous robot teams David Kortenkamp

More information

Natural Interaction with Social Robots

Natural Interaction with Social Robots Workshop: Natural Interaction with Social Robots Part of the Topig Group with the same name. http://homepages.stca.herts.ac.uk/~comqkd/tg-naturalinteractionwithsocialrobots.html organized by Kerstin Dautenhahn,

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

Robot Task-Level Programming Language and Simulation

Robot Task-Level Programming Language and Simulation Robot Task-Level Programming Language and Simulation M. Samaka Abstract This paper presents the development of a software application for Off-line robot task programming and simulation. Such application

More information

Robotics for Children

Robotics for Children Vol. xx No. xx, pp.1 8, 200x 1 1 2 3 4 Robotics for Children New Directions in Child Education and Therapy Fumihide Tanaka 1,HidekiKozima 2, Shoji Itakura 3 and Kazuo Hiraki 4 Robotics intersects with

More information

Ensuring the Safety of an Autonomous Robot in Interaction with Children

Ensuring the Safety of an Autonomous Robot in Interaction with Children Machine Learning in Robot Assisted Therapy Ensuring the Safety of an Autonomous Robot in Interaction with Children Challenges and Considerations Stefan Walke stefan.walke@tum.de SS 2018 Overview Physical

More information

Robot: icub This humanoid helps us study the brain

Robot: icub This humanoid helps us study the brain ProfileArticle Robot: icub This humanoid helps us study the brain For the complete profile with media resources, visit: http://education.nationalgeographic.org/news/robot-icub/ Program By Robohub Tuesday,

More information

Mobile Manipulation in der Telerobotik

Mobile Manipulation in der Telerobotik Mobile Manipulation in der Telerobotik Angelika Peer, Thomas Schauß, Ulrich Unterhinninghofen, Martin Buss angelika.peer@tum.de schauss@tum.de ulrich.unterhinninghofen@tum.de mb@tum.de Lehrstuhl für Steuerungs-

More information

Research Article Humanoid Robot Head Design Based on Uncanny Valley and FACS

Research Article Humanoid Robot Head Design Based on Uncanny Valley and FACS Robotics, Article ID 208924, 5 pages http://dx.doi.org/10.1155/2014/208924 Research Article Humanoid Robot Head Design Based on Uncanny Valley and FACS Jizheng Yan, 1 Zhiliang Wang, 2 and Yan Yan 2 1 SchoolofAutomationandElectricalEngineering,UniversityofScienceandTechnologyBeijing,Beijing100083,China

More information

Evolutionary Computation and Machine Intelligence

Evolutionary Computation and Machine Intelligence Evolutionary Computation and Machine Intelligence Prabhas Chongstitvatana Chulalongkorn University necsec 2005 1 What is Evolutionary Computation What is Machine Intelligence How EC works Learning Robotics

More information

ROBO-PARTNER: Safe human-robot collaboration for assembly: case studies and challenges

ROBO-PARTNER: Safe human-robot collaboration for assembly: case studies and challenges ROBO-PARTNER: Safe human-robot collaboration for assembly: case studies and challenges Dr. George Michalos University of Patras ROBOT FORUM ASSEMBLY 16 March 2016 Parma, Italy Introduction Human sensitivity

More information

Summary of robot visual servo system

Summary of robot visual servo system Abstract Summary of robot visual servo system Xu Liu, Lingwen Tang School of Mechanical engineering, Southwest Petroleum University, Chengdu 610000, China In this paper, the survey of robot visual servoing

More information

Chapter 1 Introduction to Robotics

Chapter 1 Introduction to Robotics Chapter 1 Introduction to Robotics PS: Most of the pages of this presentation were obtained and adapted from various sources in the internet. 1 I. Definition of Robotics Definition (Robot Institute of

More information

Rapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface

Rapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface Rapid Development System for Humanoid Vision-based Behaviors with Real-Virtual Common Interface Kei Okada 1, Yasuyuki Kino 1, Fumio Kanehiro 2, Yasuo Kuniyoshi 1, Masayuki Inaba 1, Hirochika Inoue 1 1

More information

Levels of Description: A Role for Robots in Cognitive Science Education

Levels of Description: A Role for Robots in Cognitive Science Education Levels of Description: A Role for Robots in Cognitive Science Education Terry Stewart 1 and Robert West 2 1 Department of Cognitive Science 2 Department of Psychology Carleton University In this paper,

More information

Homeostasis Lighting Control System Using a Sensor Agent Robot

Homeostasis Lighting Control System Using a Sensor Agent Robot Intelligent Control and Automation, 2013, 4, 138-153 http://dx.doi.org/10.4236/ica.2013.42019 Published Online May 2013 (http://www.scirp.org/journal/ica) Homeostasis Lighting Control System Using a Sensor

More information

LASA I PRESS KIT lasa.epfl.ch I EPFL-STI-IMT-LASA Station 9 I CH 1015, Lausanne, Switzerland

LASA I PRESS KIT lasa.epfl.ch I EPFL-STI-IMT-LASA Station 9 I CH 1015, Lausanne, Switzerland LASA I PRESS KIT 2016 LASA I OVERVIEW LASA (Learning Algorithms and Systems Laboratory) at EPFL, focuses on machine learning applied to robot control, humanrobot interaction and cognitive robotics at large.

More information

Body Movement Analysis of Human-Robot Interaction

Body Movement Analysis of Human-Robot Interaction Body Movement Analysis of Human-Robot Interaction Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono ATR Intelligent Robotics & Communication Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun,

More information

Development of Human-Robot Interaction Systems for Humanoid Robots

Development of Human-Robot Interaction Systems for Humanoid Robots Development of Human-Robot Interaction Systems for Humanoid Robots Bruce A. Maxwell, Brian Leighton, Andrew Ramsay Colby College {bmaxwell,bmleight,acramsay}@colby.edu Abstract - Effective human-robot

More information

Cognitive Robotics 2016/2017

Cognitive Robotics 2016/2017 Cognitive Robotics 2016/2017 Course Introduction Matteo Matteucci matteo.matteucci@polimi.it Artificial Intelligence and Robotics Lab - Politecnico di Milano About me and my lectures Lectures given by

More information

Chapter 1 Introduction

Chapter 1 Introduction Chapter 1 Introduction It is appropriate to begin the textbook on robotics with the definition of the industrial robot manipulator as given by the ISO 8373 standard. An industrial robot manipulator is

More information

This is a repository copy of Designing robot personalities for human-robot symbiotic interaction in an educational context.

This is a repository copy of Designing robot personalities for human-robot symbiotic interaction in an educational context. This is a repository copy of Designing robot personalities for human-robot symbiotic interaction in an educational context. White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/102874/

More information

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged

* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged ADVANCED ROBOTICS SOLUTIONS * Intelli Mobile Robot for Multi Specialty Operations * Advanced Robotic Pick and Place Arm and Hand System * Automatic Color Sensing Robot using PC * AI Based Image Capturing

More information

Reactive Planning with Evolutionary Computation

Reactive Planning with Evolutionary Computation Reactive Planning with Evolutionary Computation Chaiwat Jassadapakorn and Prabhas Chongstitvatana Intelligent System Laboratory, Department of Computer Engineering Chulalongkorn University, Bangkok 10330,

More information

Motion Control of Excavator with Tele-Operated System

Motion Control of Excavator with Tele-Operated System 26th International Symposium on Automation and Robotics in Construction (ISARC 2009) Motion Control of Excavator with Tele-Operated System Dongnam Kim 1, Kyeong Won Oh 2, Daehie Hong 3#, Yoon Ki Kim 4

More information

Humanoid Robots. by Julie Chambon

Humanoid Robots. by Julie Chambon Humanoid Robots by Julie Chambon 25th November 2008 Outlook Introduction Why a humanoid appearance? Particularities of humanoid Robots Utility of humanoid Robots Complexity of humanoids Humanoid projects

More information

Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano Zecca 2,3, Minoru Saito 1, Yu Mizoguchi 1, Kazuko Itoh 3,5, and Atsuo Takanishi 2,4,5

Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano Zecca 2,3, Minoru Saito 1, Yu Mizoguchi 1, Kazuko Itoh 3,5, and Atsuo Takanishi 2,4,5 2008 IEEE International Conference on Robotics and Automation Pasadena, CA, USA, May 19-23, 2008 Development of Whole-body Emotion Expression Humanoid Robot Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano

More information

Tele-manipulation of a satellite mounted robot by an on-ground astronaut

Tele-manipulation of a satellite mounted robot by an on-ground astronaut Proceedings of the 2001 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 2001 Tele-manipulation of a satellite mounted robot by an on-ground astronaut M. Oda, T. Doi, K. Wakata

More information

DEVELOPMENT OF AN ARTIFICIAL DYNAMIC FACE APPLIED TO AN AFFECTIVE ROBOT

DEVELOPMENT OF AN ARTIFICIAL DYNAMIC FACE APPLIED TO AN AFFECTIVE ROBOT DEVELOPMENT OF AN ARTIFICIAL DYNAMIC FACE APPLIED TO AN AFFECTIVE ROBOT ALVARO SANTOS 1, CHRISTIANE GOULART 2, VINÍCIUS BINOTTE 3, HAMILTON RIVERA 3, CARLOS VALADÃO 3, TEODIANO BASTOS 2, 3 1. Assistive

More information