INTERACTIVE EMOTION COMMUNICATION BETWEEN HUMAN AND ROBOT. Received February 2010; revised August 2010
International Journal of Innovative Computing, Information and Control, ICIC International, Volume 7, Number 5(B), May 2011

Yoichiro Maeda and Ryohei Taki
Graduate School of Engineering / Faculty of Engineering
University of Fukui
3-9-1, Bunkyo, Fukui City, Fukui Prefecture, 910-8507, Japan
{maeda; rtaki}@ir.his.u-fukui.ac.jp

Abstract. In this paper, we aim to realize bidirectional communication in which a human and a robot perform face-to-face interaction based on behavior with emotion. A model of Interactive Emotion Communication (IEC) is proposed: an interactive process in which a robot infers human emotions from human behavior and generates its own emotions. In addition, we evaluated the impression made by the emotional behavior of a pet-type robot in a human-robot interaction experiment.

Keywords: Emotion, Communication, Interaction, Fuzzy inference

1. Introduction. Recently, opportunities for robots to come into contact with humans are increasing; therefore, technology for interactive communication with humans is increasingly needed. In addition, a robot that lives with people requires both a flexible ability to understand human intention and the ability to express its own intention. Technology for realizing interactive communication between humans and robots has not yet been established, so there are few robots that can communicate with humans smoothly. In order to understand human intention and express robot intention, several studies using nonverbal information have been proposed [1, 2, 3, 4]. When there is a difference between the verbal and nonverbal channels in conveying emotion and attitude, nonverbal communication carries over 90% of the information about the interlocutor's emotion. Nonverbal communication takes various forms: eye contact, voice, facial expression, gesture, and so on.
On the other hand, humans can feel unpleasantness toward robot expressions that imitate humans too closely. We have therefore studied nonverbal communication based on human and robot behavior. Because a robot does not have the rich abilities humans use to express intention and can perform only a restricted range of motions, sounds, and so on, communication between human and robot is generally very difficult. For this reason, we proposed Interactive Emotion Communication, a method of communicating through emotional behavior. With this method, we try to realize interaction in which human and robot can communicate smoothly through emotions. This research uses a method of inferring emotion from human behavior [5]. First, the body features of a subject are extracted based on Laban's theory. Next, we obtain basic emotional degrees by fuzzy inference using the extracted body features. Finally, the emotion value of the human behavior is evaluated based on Russell's Circumplex Model. In this research, we aim to realize Interactive Emotion Communication (IEC), a bidirectional communication based on emotional behavior between human and robot.
Figure 1. Interactive emotion communication (IEC): Step 1, the human expresses an emotion ("I'm sad."); Step 2, the robot recognizes it ("He seems to be sad."); Step 3, the robot generates and expresses its own emotion ("I will make him happy."); Step 4, the human responds ("I'm happy.")

Figure 2. Fuzzy emotion inference system (FEIS): human motion → (1) motion measurement → (2) motion analysis (Laban's theory) → body features → (3) fuzzy inference → basic emotional degrees → (4) Russell's circumplex model → emotion → (5) robot motion (behavior)

2. Interactive Emotion Communication. This research assumes a bidirectional communication model through emotional behavior between human and robot as one example of nonverbal communication. Emotional behavior means a gesture or dance that conveys an emotion to a partner. We call this bidirectional communication Interactive Emotion Communication (IEC). Figure 1 shows a conceptual drawing of IEC. Assume there are two persons, A and B. First, A generates some emotion and expresses emotional behavior to B by gesture (Step 1). Next, B recognizes A's emotion visually (Step 2). B then expresses emotional behavior to A according to B's own emotion (Step 3). In this way, interactive communication by emotional behavior is constructed between A and B. Assuming that B is a robot, the following three processes become necessary:

1. Recognizing the human's emotion.
2. Generating the robot's emotion.
3. Expressing the robot's emotion.

There are many recent studies on understanding human emotion [6, 7] or expressing robot emotion [8]; however, we could find very few references on interaction between a human, who has many ways of expression, and a robot, which has only restricted ways of expression. In this research, we propose a new approach to communication between a human and a robot through emotional behavior, and we investigate the ability of the proposed method.
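The three processes above can be sketched as a minimal interaction loop. This is an illustrative sketch only: all function names and the dictionary-based input are our assumptions, not the authors' implementation.

```python
# Minimal sketch of one IEC cycle from the robot's side (processes 1-3).
# Names and data structures are illustrative placeholders.

EMOTIONS = ["JOY", "ANG", "SAD", "REL"]  # the four basic emotions

def recognize_emotion(human_behavior):
    """Process 1: infer the human's emotion from observed behavior.
    A real system would run FEIS on camera input; here we read a field."""
    return human_behavior.get("emotion", "REL")

def generate_emotion(human_emotion):
    """Process 2: decide the robot's own emotion (echo strategy here)."""
    return human_emotion

def express_emotion(robot_emotion):
    """Process 3: map the robot's emotion to an expressive behavior."""
    return f"{robot_emotion}-R behavior"

def iec_step(human_behavior):
    e_human = recognize_emotion(human_behavior)
    e_robot = generate_emotion(e_human)
    return express_emotion(e_robot)

print(iec_step({"emotion": "SAD"}))  # -> SAD-R behavior
```

In a full system, `generate_emotion` is where the robot's reaction strategy (random, echo, contrary, variable; see Section 4) would be plugged in.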
Our final goal is to build a robot that can recognize human emotion and express its own emotion through bidirectional communication based on the IEC model, with high interpersonal affinity.

3. Fuzzy Emotion Inference System Based on Laban's Theory. This section explains the first of the IEC processes listed above, recognizing the human's emotion, which infers a human emotion from the human's emotional behavior.
Table 1. Classification of Effort (quoted from [9])

Motion Element | Effort (Fighting / Indulging) | Measurable Aspects | Classifiable Aspects
Weight         | Firm / Gentle                 | Resistance         | Levity
Time           | Sudden / Sustained            | Speed              | Duration
Space          | Direct / Flexible             | Direction          | Expansion
Flow           | Bound / Free                  | Control            | Fluency

Table 2. Classification of Shape (quoted from [9])

Plane of Form | Fighting Form | Indulging Form
Table Plane   | Enclosing     | Spreading
Door Plane    | Ascending     | Descending
Wheel Plane   | Retreating    | Advancing

3.1. Fuzzy emotion inference system algorithm. Figure 2 shows the procedure of the Fuzzy Emotion Inference System (FEIS) proposed in this research. FEIS follows this algorithm:

(1) Measure human emotional behavior using a CCD camera.
(2) Extract body features by motion analysis based on Laban's theory.
(3) Calculate basic emotional degrees by fuzzy inference using the body features.
(4) Obtain the emotion value from the basic emotional degrees using Russell's Circumplex Model.
(5) Express robot emotional behavior based on the emotion value.

The emotions used in this research are classified into four basic emotions (Joy: JOY, Anger: ANG, Sadness: SAD, Relaxation: REL). In the following sections, we discuss human and robot emotions only in terms of these four types.

3.2. Motion analysis based on Laban's theory. Laban's theory [9], proposed by R. Laban, is a method to extract macro features from human body motions. It offers three types of description of motion features: Effort-Shape Description, Motif Description, and Structural Description. Of these, Effort-Shape Description describes the quality of motions and the meaning of the expression. Because it is useful for classifying human body motion by visual features, we use this type of description to analyze human body motions. Laban proposed that there is a bipolar system based on Fighting Form and Indulging Form in the expression of human body motion.
Fighting Form means active and vivid body motion, and Indulging Form means slow and gentle body motion. The concept that human motions are subdivided by these forms is Effort-Shape Description. Tables 1 and 2 show the classification of Effort and Shape. Effort is effective for classifying body motion based on Kansei (affective) information. Shape captures the overall static form of the body motion and does not consider local motion features. In this research, we take Time Effort as the speed of the body's center of gravity, Flow Effort as the hand acceleration, Table-Plane
Shape as the body area, and Door-Plane Shape as the height of the body's center of gravity.

Figure 3. Membership functions and singletons: (a) Area La {AS, AM, AL}; (b) Position Lp {PL, PM, PH}; (c) Velocity Lv {VS, VM, VF}; (d) Hand Acceleration Lh {HS, HM, HL}; (e) Pleasure-Unpleasure Rx {NUL, NUM, NUS, NEU, PPS, PPM, PPL}; (f) Arousal-Sleep Ry {NSL, NSM, NSS, NEU, PAS, PAM, PAL}

Table 3. Values of membership functions and singletons

3.3. Fuzzy emotion inference rule. Figure 3, Tables 3 and 4 show the membership functions, singletons, and fuzzy rules used in FEIS. Tanabe et al. proposed the basic theory of this system [5]. The body features extracted by the motion analysis based on Laban's theory serve as the input values of the fuzzy inference, which outputs the basic emotional degrees. Values on the Pleasure-Unpleasure and Arousal-Sleep axes are decided by the rules of Table 4, so that the system obtains an emotion value on Russell's Circumplex Model.

3.4. Russell's Circumplex Model. J. A. Russell proposed the Circumplex Model in 1980 [10], in which all emotions are expressed on a plane defined by two dimensions: Pleasure-Unpleasure and Arousal-Sleep. Witvliet and Vrana [11] additionally proposed assigning four basic emotions to the four quadrants of this model. We therefore also define the human emotion using these four basic emotions, one per quadrant: JOY, ANG, SAD and REL (see Figure 4). In this research, the human emotion is inferred from Rx (Pleasure-Unpleasure) and Ry (Arousal-Sleep) obtained by FEIS. We decide the human emotion from the quadrant to which the inference result (Rx, Ry) belongs.
The emotion value (E_i : i = JOY, ANG, SAD, REL) represents the emotional strength in this method. E_i is calculated from Equations (1) and (2):

E_i = √(Rx² + Ry²) · |sin(2θ)|    (1)

θ = arctan(Ry / Rx)    (2)
Table 4. Fuzzy emotion inference rule (antecedents: Area {AS, AM, AL}, Position {PL, PM, PH}, Velocity {VS, VM, VF}, Hand Acceleration {HS, HM, HL}; each cell gives an Rx label (upper) and an Ry label (lower))

Figure 4. Basic emotions on Russell's circumplex model (quoted from [10])

The basic emotion i is selected by the quadrant of θ:

i = JOY  (0 ≤ θ < π/2)
    ANG  (π/2 ≤ θ < π)
    SAD  (π ≤ θ < 3π/2)
    REL  (3π/2 ≤ θ < 2π)
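As a numerical illustration, the mapping from a FEIS output (Rx, Ry) to a basic emotion and its strength can be sketched as below. This is a minimal sketch under our reading of the garbled source, which takes the strength as √(Rx² + Ry²)·|sin 2θ|; the function name is illustrative, not the authors' code.

```python
import math

def emotion_from_feis(rx, ry):
    """Map FEIS output (rx: pleasure-unpleasure, ry: arousal-sleep)
    to a basic emotion label and its strength E_i."""
    # Angle on the circumplex, normalized to [0, 2*pi)
    theta = math.atan2(ry, rx) % (2 * math.pi)
    # Strength: magnitude scaled by |sin(2*theta)|, maximal on the
    # quadrant diagonals (assumed reconstruction of Equation (1))
    strength = math.hypot(rx, ry) * abs(math.sin(2 * theta))
    # Quadrant selection, one basic emotion per quadrant
    if theta < math.pi / 2:
        label = "JOY"
    elif theta < math.pi:
        label = "ANG"
    elif theta < 3 * math.pi / 2:
        label = "SAD"
    else:
        label = "REL"
    return label, strength

label, strength = emotion_from_feis(1.0, 1.0)  # JOY, strength about 1.41
```

`math.atan2` is used instead of `arctan(Ry/Rx)` so that the quadrant is resolved correctly even when Rx is negative or zero.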
Figure 5. Experimental environment

4. Experiment. Since the FEIS described above can infer rough human emotions, we additionally attempted an interactive experiment between a human and a pet-type robot. In this experiment, we denote the basic emotional behaviors expressed by the human as JOY-H, ANG-H, SAD-H and REL-H, and those expressed by the robot as JOY-R, ANG-R, SAD-R and REL-R.

4.1. Experimental environment. We used AIBO (SONY ERS-7) as the pet-type robot. AIBO's program can be freely read and written from a personal computer through a dedicated memory stick. The robot has 20 degrees of freedom in total. The experimental environment is shown in Figure 5. In this experiment, FEIS ran on a separate computer with a web camera, because the large volume of image data of the human's expression (3 to 5 frames/sec) had to be processed. In the next stage, however, we plan to realize face-to-face communication between human and robot.

4.2. Precondition of experiment. This experiment was performed with the cooperation of a 22-year-old student in our laboratory as the subject. The subject wore five markers of different colors on the head, both hands and both feet so that body features could be extracted. The subject expressed emotional behavior freely, without a time limit. First, the subject's emotional behavior is measured by the web camera connected to the personal computer (see Figure 5). Next, the human emotion is recognized from the emotional behavior by FEIS. The result of FEIS is then sent to the robot through the wireless LAN. Finally, the robot expresses emotional behavior according to the result of FEIS. The emotional behavior expressed by the robot was restricted to one of 4 patterns (JOY-R, ANG-R, SAD-R, REL-R), and each behavior took 3 to 6 seconds to express. We performed the following four kinds of experiment.
(1) Random Reaction: The robot expresses random emotional behavior with no relationship to the result of FEIS.
(2) Echo Reaction: The robot expresses emotional behavior that matches the result of FEIS, that is, the same emotion the human expressed.
(3) Contrary Reaction: The robot expresses emotional behavior opposite to the result of FEIS, that is, the emotion symmetric about the origin in Figure 4.
(4) Variable Reaction: The robot expresses emotional behavior taught by an observer, without relationship to the result of FEIS. However, the robot expresses the inverse emotion with a probability of 5%. For example, the inverse emotion means that the robot expresses SAD-R if the result of FEIS is JOY-H, and REL-R if the result is ANG-H.
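The four reaction conditions can be summarized as a short dispatch function. This is a sketch with illustrative names (not the authors' code); the `taught_emotion` argument stands in for the observer's instruction in the Variable Reaction condition, and the 5% inversion probability is taken from the text above.

```python
import random

# Opposite emotions are origin-symmetric on the circumplex (Figure 4).
OPPOSITE = {"JOY": "SAD", "SAD": "JOY", "ANG": "REL", "REL": "ANG"}
EMOTIONS = list(OPPOSITE)

def select_reaction(strategy, feis_emotion, taught_emotion=None,
                    p_invert=0.05, rng=random):
    """Choose the robot's emotion under one of the four conditions."""
    if strategy == "random":      # (1) ignore the FEIS result entirely
        return rng.choice(EMOTIONS)
    if strategy == "echo":        # (2) mirror the recognized human emotion
        return feis_emotion
    if strategy == "contrary":    # (3) origin-symmetric (inverse) emotion
        return OPPOSITE[feis_emotion]
    if strategy == "variable":    # (4) observer-taught, sometimes inverted
        base = taught_emotion or feis_emotion
        return OPPOSITE[base] if rng.random() < p_invert else base
    raise ValueError(f"unknown strategy: {strategy}")
```

For example, `select_reaction("contrary", "ANG")` returns `"REL"`, matching the origin symmetry of Figure 4.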
Table 5. Communication time and order of likability

Reaction          | Communication Time (sec) | Order of Likability
Random Reaction   | 6                        | 2
Echo Reaction     | 9                        | 1
Contrary Reaction | 36                       | 4
Variable Reaction | 7                        | 3

The subject observed the robot in front of him while expressing emotional behavior, and a questionnaire on his impression of each robot reaction was administered after the experiment. We performed the experiments in order from (1) Random Reaction to (4) Variable Reaction. The subject knew beforehand which emotion each robot behavior expressed, so that he could understand the robot's emotion.

4.3. Experimental results. Table 5 shows the result of communication between human and robot. The communication time is the interval during which the subject kept communicating with the robot, in other words, the time until he got tired of it. The order of likability is the order in which the subject preferred each reaction. We consider a long communication time to indicate comparatively good communication, because we communicate for a long time with favorite friends.

(1) The Random Reaction experiment produced an incoherent pattern of emotional behavior and timing, yet its evaluation was high, contrary to our expectations. The Random Reaction result is shown in Figure 6 as solid lines. The graphs show the outputs of FEIS and the timing of the robot's emotional behavior; the x axis is the experiment time and the y axis is the emotion value (E_i). We divide each experimental result into four graphs ((a) JOY, (b) ANG, (c) SAD and (d) REL). The gray areas show the periods when the subject performed emotional behavior, and the timing when the robot generated its emotion is shown with broken lines.
Because AIBO sometimes expressed the same emotion as the subject, we think the subject believed that AIBO was expressing emotion in accordance with the result of FEIS. According to the questionnaire, the subject had the impression that the robot was acting independently, by itself. The robot frequently expressed JOY-R when the subject expressed SAD-H; we think these reactions contributed to the good impression.

(2) The Echo Reaction result is shown in Figure 7. In this experiment, because FEIS inferred the original human emotion accurately, the robot expressed its emotion in accordance with the human's behavior. The Echo Reaction gave the impression that the robot was considerably more intelligent and the most familiar to the human; nevertheless, the communication time, about 9 seconds, was the shortest of all reactions. This is because the subject got bored with the simple pattern of the robot repeating the human's own emotion.

(3) The Contrary Reaction experiment gave an impression of estrangement from the robot, so its impression was the worst. The Contrary Reaction result is shown in Figure 8. In both reactions (2) and (3), only one robot emotional behavior is selected for a given human emotional behavior; for example, in reaction (3) the robot expressed only SAD-R when the human expressed JOY-H. However, the two reactions differed significantly in the order of likability. This is because a negative robot emotion (SAD-R or ANG-R) in response to a positive human emotion (JOY-H or REL-H) made a worse impression than a positive robot
emotion (JOY-R or REL-R) in response to a negative human emotion (SAD-H or ANG-H). The communication time, 36 seconds, was comparatively short. This is because the subject noticed that the robot was intentionally expressing the inverse emotion.

Figure 6. FEIS result (Case (1): Random reaction)

Figure 7. FEIS result (Case (2): Echo reaction)

(4) The Variable Reaction experiment gave the impression of the most complicated emotional expression, and the second-worst impression overall. The Variable Reaction result is shown in Figure 9. The subject tried desperately to convey his emotion, so the communication time was the longest of all the experiments.
Figure 8. FEIS result (Case (3): Contrary reaction)

Figure 9. FEIS result (Case (4): Variable reaction)

The subject did not feel close friendship, but he also had the impression that this reaction was the most interesting.

4.4. Remarks. The subject said that the communication experiment with the robot was interesting, so we confirmed that communication with the robot is useful and attracts human interest. The Echo Reaction had a high evaluation value, but the subject tired of it quickly, so the communication time was very short. This reaction is still far from practical and needs to be improved.
By contrast, the subject actively tried to communicate in the Contrary Reaction and Variable Reaction experiments, although these gave an unfamiliar impression. In the questionnaire he answered that he wanted the robot to express the same emotion; we could thus confirm the effect of communicating actively. This result is useful when considering suitable robot reactions.

5. Conclusion. In this paper, we constructed a basic system in which a robot communicates with a human based on IEC, and we investigated the impressions made by a robot actually performing emotional behavior. As a result, the difference in robot reaction influenced the human's impression and how quickly he tired. We must consider which reaction is the most natural to generate for a given human emotion, and which reaction gives the highest interpersonal affinity. In this research, the human emotion was inferred from emotional behavior on an external computer in order to process the many images of human behavior. In the future, however, we must construct a system in which all processing is performed on the robot. Furthermore, tuning the parameters of the fuzzy rules for each subject takes a lot of time; we must therefore develop a system in which fuzzy rules are easy to construct even in experiments with many subjects.

Acknowledgment. This research was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (C), 29-2.

REFERENCES

[1] K. Itoh, H. Miwa, M. Matsumoto, M. Zecca et al., Various emotion expressions with emotion expression humanoid robot WE-4RII, Proc. of the 1st IEEE Technical Exhibition Based Conference on Robotics and Automation, pp.35-36, 2004.
[2] A. Bruce, I. Nourbakhsh and R. Simmons, The role of expressiveness and attention in human-robot interaction, Proc. of 2002 IEEE International Conference on Robotics and Automation, vol.4, 2002.
[3] P. Y.
Oudeyer, The production and recognition of emotions in speech: Features and algorithms, International Journal of Human-Computer Studies, vol.62, pp.157-183, 2003.
[4] M. Kanoh, S. Iwata, S. Kato and H. Itoh, Emotive facial expressions of sensitivity communication robot Ifbot, Kansei Engineering International, vol.5, no.3, pp.35-42, 2005.
[5] N. Tanabe and Y. Maeda, Emotional behavior evaluation method used fuzzy reasoning for pet-type robot, Human and Artificial Intelligence Systems: From Control to Autonomy (HART), 2004.
[6] J. Minato, K. Matsumoto, F. Ren, S. Tsuchiya and S. Kuroiwa, Evaluation of emotion estimation methods based on statistic features of emotion tagged corpus, International Journal of Innovative Computing, Information and Control, vol.4, no.8, pp.93-94, 2008.
[7] Y. Oyama and Y. Narita, A proposal for automatic analysis of emotions using facial charts, International Journal of Innovative Computing, Information and Control, vol.5, no.3, 2009.
[8] T. Nomura and A. Nakao, Comparison on identification of affective body motions by robots between elder people and university students: A case study in Japan, International Journal of Social Robotics, vol.2, pp.47-57, 2010.
[9] R. Laban, The Mastery of Movement, Plays, Inc., 1971.
[10] J. A. Russell, A circumplex model of affect, Journal of Personality and Social Psychology, vol.39, pp.1161-1178, 1980.
[11] C. V. O. Witvliet and S. R. Vrana, Psychophysiological responses as indices of affective dimensions, Psychophysiology, vol.32, 1995.
More informationISMCR2004. Abstract. 2. The mechanism of the master-slave arm of Telesar II. 1. Introduction. D21-Page 1
Development of Multi-D.O.F. Master-Slave Arm with Bilateral Impedance Control for Telexistence Riichiro Tadakuma, Kiyohiro Sogen, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi 7-3-1 Hongo, Bunkyo-ku,
More informationAn Intuitional Method for Mobile Robot Path-planning in a Dynamic Environment
An Intuitional Method for Mobile Robot Path-planning in a Dynamic Environment Ching-Chang Wong, Hung-Ren Lai, and Hui-Chieh Hou Department of Electrical Engineering, Tamkang University Tamshui, Taipei
More informationInteraction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping
Robotics and Autonomous Systems 54 (2006) 414 418 www.elsevier.com/locate/robot Interaction rule learning with a human partner based on an imitation faculty with a simple visuo-motor mapping Masaki Ogino
More informationAutonomous Stair Climbing Algorithm for a Small Four-Tracked Robot
Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Quy-Hung Vu, Byeong-Sang Kim, Jae-Bok Song Korea University 1 Anam-dong, Seongbuk-gu, Seoul, Korea vuquyhungbk@yahoo.com, lovidia@korea.ac.kr,
More informationNobutsuna Endo 1, Shimpei Momoki 1, Massimiliano Zecca 2,3, Minoru Saito 1, Yu Mizoguchi 1, Kazuko Itoh 3,5, and Atsuo Takanishi 2,4,5
2008 IEEE International Conference on Robotics and Automation Pasadena, CA, USA, May 19-23, 2008 Development of Whole-body Emotion Expression Humanoid Robot Nobutsuna Endo 1, Shimpei Momoki 1, Massimiliano
More informationReading human relationships from their interaction with an interactive humanoid robot
Reading human relationships from their interaction with an interactive humanoid robot Takayuki Kanda 1 and Hiroshi Ishiguro 1,2 1 ATR, Intelligent Robotics and Communication Laboratories 2-2-2 Hikaridai
More informationConverting Motion between Different Types of Humanoid Robots Using Genetic Algorithms
Converting Motion between Different Types of Humanoid Robots Using Genetic Algorithms Mari Nishiyama and Hitoshi Iba Abstract The imitation between different types of robots remains an unsolved task for
More informationEssay on A Survey of Socially Interactive Robots Authors: Terrence Fong, Illah Nourbakhsh, Kerstin Dautenhahn Summarized by: Mehwish Alam
1 Introduction Essay on A Survey of Socially Interactive Robots Authors: Terrence Fong, Illah Nourbakhsh, Kerstin Dautenhahn Summarized by: Mehwish Alam 1.1 Social Robots: Definition: Social robots are
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationExperimental Investigation into Influence of Negative Attitudes toward Robots on Human Robot Interaction
Experimental Investigation into Influence of Negative Attitudes toward Robots on Human Robot Interaction Tatsuya Nomura 1,2 1 Department of Media Informatics, Ryukoku University 1 5, Yokotani, Setaohe
More informationThe Classification of Gun s Type Using Image Recognition Theory
International Journal of Information and Electronics Engineering, Vol. 4, No. 1, January 214 The Classification of s Type Using Image Recognition Theory M. L. Kulthon Kasemsan Abstract The research aims
More informationA NOVEL IMAGE PROCESSING TECHNIQUE TO EXTRACT FACIAL EXPRESSIONS FROM MOUTH REGIONS
A NOVEL IMAGE PROCESSING TECHNIQUE TO EXTRACT FACIAL EXPRESSIONS FROM MOUTH REGIONS S.Sowmiya 1, Dr.K.Krishnaveni 2 1 Student, Department of Computer Science 2 1, 2 Associate Professor, Department of Computer
More informationEstimating Group States for Interactive Humanoid Robots
Estimating Group States for Interactive Humanoid Robots Masahiro Shiomi, Kenta Nohara, Takayuki Kanda, Hiroshi Ishiguro, and Norihiro Hagita Abstract In human-robot interaction, interactive humanoid robots
More informationContext Aware Computing
Context Aware Computing Context aware computing: the use of sensors and other sources of information about a user s context to provide more relevant information and services Context independent: acts exactly
More informationText Emotion Detection using Neural Network
International Journal of Engineering Research and Technology. ISSN 0974-3154 Volume 7, Number 2 (2014), pp. 153-159 International Research Publication House http://www.irphouse.com Text Emotion Detection
More information* Intelli Robotic Wheel Chair for Specialty Operations & Physically Challenged
ADVANCED ROBOTICS SOLUTIONS * Intelli Mobile Robot for Multi Specialty Operations * Advanced Robotic Pick and Place Arm and Hand System * Automatic Color Sensing Robot using PC * AI Based Image Capturing
More informationThe Autonomous Performance Improvement of Mobile Robot using Type-2 Fuzzy Self-Tuning PID Controller
, pp.182-187 http://dx.doi.org/10.14257/astl.2016.138.37 The Autonomous Performance Improvement of Mobile Robot using Type-2 Fuzzy Self-Tuning PID Controller Sang Hyuk Park 1, Ki Woo Kim 1, Won Hyuk Choi
More informationAutomatic Licenses Plate Recognition System
Automatic Licenses Plate Recognition System Garima R. Yadav Dept. of Electronics & Comm. Engineering Marathwada Institute of Technology, Aurangabad (Maharashtra), India yadavgarima08@gmail.com Prof. H.K.
More informationNear Infrared Face Image Quality Assessment System of Video Sequences
2011 Sixth International Conference on Image and Graphics Near Infrared Face Image Quality Assessment System of Video Sequences Jianfeng Long College of Electrical and Information Engineering Hunan University
More informationSystem of Recognizing Human Action by Mining in Time-Series Motion Logs and Applications
The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan System of Recognizing Human Action by Mining in Time-Series Motion Logs and Applications
More informationAn Experimental Comparison of Path Planning Techniques for Teams of Mobile Robots
An Experimental Comparison of Path Planning Techniques for Teams of Mobile Robots Maren Bennewitz Wolfram Burgard Department of Computer Science, University of Freiburg, 7911 Freiburg, Germany maren,burgard
More informationDrink Bottle Defect Detection Based on Machine Vision Large Data Analysis. Yuesheng Wang, Hua Li a
Advances in Computer Science Research, volume 6 International Conference on Artificial Intelligence and Engineering Applications (AIEA 06) Drink Bottle Defect Detection Based on Machine Vision Large Data
More informationRoboCup: Not Only a Robotics Soccer Game but also a New Market Created for Future
RoboCup: Not Only a Robotics Soccer Game but also a New Market Created for Future Kuo-Yang Tu Institute of Systems and Control Engineering National Kaohsiung First University of Science and Technology
More informationBirth of An Intelligent Humanoid Robot in Singapore
Birth of An Intelligent Humanoid Robot in Singapore Ming Xie Nanyang Technological University Singapore 639798 Email: mmxie@ntu.edu.sg Abstract. Since 1996, we have embarked into the journey of developing
More informationAdaptive Humanoid Robot Arm Motion Generation by Evolved Neural Controllers
Proceedings of the 3 rd International Conference on Mechanical Engineering and Mechatronics Prague, Czech Republic, August 14-15, 2014 Paper No. 170 Adaptive Humanoid Robot Arm Motion Generation by Evolved
More informationRobot Personality from Perceptual Behavior Engine : An Experimental Study
Robot Personality from Perceptual Behavior Engine : An Experimental Study Dongwook Shin, Jangwon Lee, Hun-Sue Lee and Sukhan Lee School of Information and Communication Engineering Sungkyunkwan University
More informationImproving the Safety and Efficiency of Roadway Maintenance Phase II: Developing a Vision Guidance System for the Robotic Roadway Message Painter
Improving the Safety and Efficiency of Roadway Maintenance Phase II: Developing a Vision Guidance System for the Robotic Roadway Message Painter Final Report Prepared by: Ryan G. Rosandich Department of
More informationEmotion Based Music Player
ISSN 2278 0211 (Online) Emotion Based Music Player Nikhil Zaware Tejas Rajgure Amey Bhadang D. D. Sapkal Professor, Department of Computer Engineering, Pune, India Abstract: Facial expression provides
More informationLearning and Using Models of Kicking Motions for Legged Robots
Learning and Using Models of Kicking Motions for Legged Robots Sonia Chernova and Manuela Veloso Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213 {soniac, mmv}@cs.cmu.edu Abstract
More informationTHE DEVELOPMENT of domestic and service robots has
1290 IEEE TRANSACTIONS ON CYBERNETICS, VOL. 43, NO. 4, AUGUST 2013 Robotic Emotional Expression Generation Based on Mood Transition and Personality Model Meng-Ju Han, Chia-How Lin, and Kai-Tai Song, Member,
More informationAn Improved Path Planning Method Based on Artificial Potential Field for a Mobile Robot
BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 15, No Sofia 015 Print ISSN: 1311-970; Online ISSN: 1314-4081 DOI: 10.1515/cait-015-0037 An Improved Path Planning Method Based
More informationRobotic Systems ECE 401RB Fall 2007
The following notes are from: Robotic Systems ECE 401RB Fall 2007 Lecture 14: Cooperation among Multiple Robots Part 2 Chapter 12, George A. Bekey, Autonomous Robots: From Biological Inspiration to Implementation
More informationArtificial Beacons with RGB-D Environment Mapping for Indoor Mobile Robot Localization
Sensors and Materials, Vol. 28, No. 6 (2016) 695 705 MYU Tokyo 695 S & M 1227 Artificial Beacons with RGB-D Environment Mapping for Indoor Mobile Robot Localization Chun-Chi Lai and Kuo-Lan Su * Department
More informationLimits of a Distributed Intelligent Networked Device in the Intelligence Space. 1 Brief History of the Intelligent Space
Limits of a Distributed Intelligent Networked Device in the Intelligence Space Gyula Max, Peter Szemes Budapest University of Technology and Economics, H-1521, Budapest, Po. Box. 91. HUNGARY, Tel: +36
More informationFacial Caricaturing Robot COOPER in EXPO 2005
Facial Caricaturing Robot COOPER in EXPO 2005 Takayuki Fujiwara, Takashi Watanabe, Takuma Funahashi, Hiroyasu Koshimizu and Katsuya Suzuki School of Information Sciences and Technology Chukyo University
More informationNCCT IEEE PROJECTS ADVANCED ROBOTICS SOLUTIONS. Latest Projects, in various Domains. Promise for the Best Projects
NCCT Promise for the Best Projects IEEE PROJECTS in various Domains Latest Projects, 2009-2010 ADVANCED ROBOTICS SOLUTIONS EMBEDDED SYSTEM PROJECTS Microcontrollers VLSI DSP Matlab Robotics ADVANCED ROBOTICS
More informationEnergy-Efficient Mobile Robot Exploration
Energy-Efficient Mobile Robot Exploration Abstract Mobile robots can be used in many applications, including exploration in an unknown area. Robots usually carry limited energy so energy conservation is
More informationOptic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball
Optic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball Masaki Ogino 1, Masaaki Kikuchi 1, Jun ichiro Ooga 1, Masahiro Aono 1 and Minoru Asada 1,2 1 Dept. of Adaptive Machine
More informationBODILY NON-VERBAL INTERACTION WITH VIRTUAL CHARACTERS
KEER2010, PARIS MARCH 2-4 2010 INTERNATIONAL CONFERENCE ON KANSEI ENGINEERING AND EMOTION RESEARCH 2010 BODILY NON-VERBAL INTERACTION WITH VIRTUAL CHARACTERS Marco GILLIES *a a Department of Computing,
More informationGroup Robots Forming a Mechanical Structure - Development of slide motion mechanism and estimation of energy consumption of the structural formation -
Proceedings 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation July 16-20, 2003, Kobe, Japan Group Robots Forming a Mechanical Structure - Development of slide motion
More informationHow Many Pixels Do We Need to See Things?
How Many Pixels Do We Need to See Things? Yang Cai Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA ycai@cmu.edu
More informationAn Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi
An Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi Department of E&TC Engineering,PVPIT,Bavdhan,Pune ABSTRACT: In the last decades vehicle license plate recognition systems
More informationA Communication Model for Inter-vehicle Communication Simulation Systems Based on Properties of Urban Areas
IJCSNS International Journal of Computer Science and Network Security, VO.6 No.10, October 2006 3 A Communication Model for Inter-vehicle Communication Simulation Systems Based on Properties of Urban Areas
More informationPhysical and Affective Interaction between Human and Mental Commit Robot
Proceedings of the 21 IEEE International Conference on Robotics & Automation Seoul, Korea May 21-26, 21 Physical and Affective Interaction between Human and Mental Commit Robot Takanori Shibata Kazuo Tanie
More informationDevelopment of Drum CVT for a Wire-Driven Robot Hand
The 009 IEEE/RSJ International Conference on Intelligent Robots and Systems October 11-15, 009 St. Louis, USA Development of Drum CVT for a Wire-Driven Robot Hand Kojiro Matsushita, Shinpei Shikanai, and
More informationTele-Nursing System with Realistic Sensations using Virtual Locomotion Interface
6th ERCIM Workshop "User Interfaces for All" Tele-Nursing System with Realistic Sensations using Virtual Locomotion Interface Tsutomu MIYASATO ATR Media Integration & Communications 2-2-2 Hikaridai, Seika-cho,
More informationMEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING. J. Ondra Department of Mechanical Technology Military Academy Brno, Brno, Czech Republic
MEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING J. Ondra Department of Mechanical Technology Military Academy Brno, 612 00 Brno, Czech Republic Abstract: A surface roughness measurement technique, based
More informationKeywords: Immediate Response Syndrome, Artificial Intelligence (AI), robots, Social Networking Service (SNS) Introduction
Psychology Research, January 2018, Vol. 8, No. 1, 20-25 doi:10.17265/2159-5542/2018.01.003 D DAVID PUBLISHING The Relationship Between Immediate Response Syndrome and the Expectations Toward Artificial
More informationAcquisition of Human Operation Characteristics for Kite-based Tethered Flying Robot using Human Operation Data
Acquisition of Human Operation Characteristics for Kite-based Tethered Flying Robot using Human Operation Data Chiaki Todoroki and Yasutake Takahashi Dept. of Human & Artificial Intelligent Systems, Graduate
More informationOpen Access Partial Discharge Fault Decision and Location of 24kV Multi-layer Porcelain Insulator based on Power Spectrum Density Algorithm
Send Orders for Reprints to reprints@benthamscience.ae 342 The Open Electrical & Electronic Engineering Journal, 15, 9, 342-346 Open Access Partial Discharge Fault Decision and Location of 24kV Multi-layer
More informationHaptic presentation of 3D objects in virtual reality for the visually disabled
Haptic presentation of 3D objects in virtual reality for the visually disabled M Moranski, A Materka Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND marcin.moranski@p.lodz.pl,
More informationKeywords: Multi-robot adversarial environments, real-time autonomous robots
ROBOT SOCCER: A MULTI-ROBOT CHALLENGE EXTENDED ABSTRACT Manuela M. Veloso School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213, USA veloso@cs.cmu.edu Abstract Robot soccer opened
More informationENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS
BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of
More informationContents. Mental Commit Robot (Mental Calming Robot) Industrial Robots. In What Way are These Robots Intelligent. Video: Mental Commit Robots
Human Robot Interaction for Psychological Enrichment Dr. Takanori Shibata Senior Research Scientist Intelligent Systems Institute National Institute of Advanced Industrial Science and Technology (AIST)
More informationStabilize humanoid robot teleoperated by a RGB-D sensor
Stabilize humanoid robot teleoperated by a RGB-D sensor Andrea Bisson, Andrea Busatto, Stefano Michieletto, and Emanuele Menegatti Intelligent Autonomous Systems Lab (IAS-Lab) Department of Information
More informationModelling and Simulation of Tactile Sensing System of Fingers for Intelligent Robotic Manipulation Control
20th International Congress on Modelling and Simulation, Adelaide, Australia, 1 6 December 2013 www.mssanz.org.au/modsim2013 Modelling and Simulation of Tactile Sensing System of Fingers for Intelligent
More informationDevelopment of an Automatic Camera Control System for Videoing a Normal Classroom to Realize a Distant Lecture
Development of an Automatic Camera Control System for Videoing a Normal Classroom to Realize a Distant Lecture Akira Suganuma Depertment of Intelligent Systems, Kyushu University, 6 1, Kasuga-koen, Kasuga,
More informationIN MOST human robot coordination systems that have
IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 54, NO. 2, APRIL 2007 699 Dance Step Estimation Method Based on HMM for Dance Partner Robot Takahiro Takeda, Student Member, IEEE, Yasuhisa Hirata, Member,
More informationThe effect of gaze behavior on the attitude towards humanoid robots
The effect of gaze behavior on the attitude towards humanoid robots Bachelor Thesis Date: 27-08-2012 Author: Stefan Patelski Supervisors: Raymond H. Cuijpers, Elena Torta Human Technology Interaction Group
More informationCORC 3303 Exploring Robotics. Why Teams?
Exploring Robotics Lecture F Robot Teams Topics: 1) Teamwork and Its Challenges 2) Coordination, Communication and Control 3) RoboCup Why Teams? It takes two (or more) Such as cooperative transportation:
More informationTouch Perception and Emotional Appraisal for a Virtual Agent
Touch Perception and Emotional Appraisal for a Virtual Agent Nhung Nguyen, Ipke Wachsmuth, Stefan Kopp Faculty of Technology University of Bielefeld 33594 Bielefeld Germany {nnguyen, ipke, skopp}@techfak.uni-bielefeld.de
More informationSocial Acceptance of Humanoid Robots
Social Acceptance of Humanoid Robots Tatsuya Nomura Department of Media Informatics, Ryukoku University, Japan nomura@rins.ryukoku.ac.jp 2012/11/29 1 Contents Acceptance of Humanoid Robots Technology Acceptance
More information