Using a Robot's Voice to Make Human-Robot Interaction More Engaging
Hans van de Kamp
University of Twente
P.O. Box 217, 7500AE Enschede
The Netherlands
h.vandekamp@student.utwente.nl

ABSTRACT
Nowadays a robot is becoming more than just a machine: the robot becomes an interaction partner, and a human needs to be engaged to interact with the robot. This paper describes an experiment on robot voices in a task-based environment. The goal was to determine the influence of the robot's voice on how engaged or interested humans are in performing a certain task. This research contributes to the topic of engagement in human-robot interaction with different voice styles. Participants were asked to perform six small assignments to measure the effects of two different voices: a human-like voice (N=10) and a machine-like or mechanical voice (N=11). There were some significant differences between the two voices, mostly related to the likeability of the robot. The differences between the voices in terms of interest or engagement turned out to be minimal and not significant.

Keywords
Robot voice, challenging, engagement, human-robot interaction, task interest, robot interest

1. INTRODUCTION
Different frameworks for human-robot interaction have been created in the past years [1]. Most of them try to improve human-robot interaction by incorporating human behavior and human personality traits in robots [7, 16]. Voice is an important factor in human personality [8]; therefore many robots use a human-like voice to interact with humans. Creating a human-like interaction partner has proven to be valuable in human-robot interaction [4, 10] in terms of effectiveness and efficiency [11, 15]. This research will not evaluate human-robot interaction by measuring task effectiveness or task efficiency, but by the way humans are engaged [12] while performing a certain assignment.
This research aims to provide insight into the relationship between a robot's voice and engagement [12] in terms of interest.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. 18th Twente Student Conference on IT, January 25th, 2013, Enschede, The Netherlands. Copyright 2013, University of Twente, Faculty of Electrical Engineering, Mathematics and Computer Science.

1.1 Problem statement
Other research [4] focuses on models in which robots mirror the participants' behavior to create a more human-like interaction partner. However, most of that research does not focus on engaging a human. This paper focuses only on engaging a participant in human-robot interaction. Engagement is related to the concept of interest, as it describes attentional and emotional involvement [12]. The goal of this research is to determine the influence of a robot's voice on the level of engagement in terms of interest [12] in human-robot interaction. It seems natural to use a more human-like robot to improve human-robot interaction, assuming that the robot is perceived as human-like. However, most robots are far from being like a human. If a human perceives the robot as artificial and not as human-like, this might influence the expectancy of the robot's voice. If a robot is perceived as artificial and uses a mechanical voice instead of a human-like voice, does this influence the degree to which humans are engaged to perform a certain task? Are humans more interested in a robot with a mechanical voice than in one with a human-like voice?
Both questions are important to understand the relationship between humans and robots in human-robot interaction, or more specifically in this paper: the relationship between voice and engagement. With the previous questions in mind, the following research questions can be formulated.

1.2 Research Questions
To investigate the relationship between a robot's voice and participant interest towards the robot or task, a few questions need to be answered. The main research question is: does a robot with a human-like voice make human-robot interaction more engaging than a robot with a mechanical voice? Other research questions will be used to find an answer to the main question:
- How can we find out if a person is engaged or interested in the robot or task while interacting with a robot?
- Does a robot's voice influence the way participants are engaged to perform a task in terms of interest? This can be divided into two more questions:
  - Does a robot's voice influence the way humans are interested in the robot?
  - Does a robot's voice influence the way humans are interested in the task?
2. RELATED WORK
Research on human-robot interaction is often about creating a lifelike interaction partner. Therefore a lot of research is conducted with a human-like voice only, some of which is mentioned in the introduction. Important related work to this research was conducted by Walters et al. Rich and Sidner also wrote about the concept of engagement in human-robot interaction. However, they did not use different voice styles or robot appearances like Walters et al. A paper on robot appearance and personality by Walters et al. [17] investigated people's perceptions of different robot appearances. That research uses the definitions of robot appearance based on the definitions of Gong and Nass [5] and MacDorman and Ishiguro [9]. In this experiment the robot appearances are referred to as human-like or machine-like. This paper describes the relation between interest and robot voices; it does not talk about appearance preferences in general.

Other research by Walters et al. [18] used different robot voices, similar to the experiment described in this paper. There has also been research on the gender of the robot. Siegel et al. [13] conducted research to determine the preferences of males and females regarding robot gender. That research showed that females generally did not have a preference for robot gender. However, males seem to prefer a female robot gender. Research performed by Crowell et al. [3] indicates that the perceived gender of the robot may influence human sex-related characteristics.

3. METHODOLOGY
In this experiment the Magabot robot is used. A photo of the robot is shown in Figure 1. The robot is small (less than 1 meter tall) and offers a platform on which a laptop can be placed. The robot was controlled using a Wizard of Oz technique because of communication problems between the robot and Flash. The robot drives on a platform between two tables, which is explained in section 3.3. The voice and eyes are programmed in Flash (ActionScript). The robot used a predefined script in the form of a Flash timeline. The timeline was divided into several segments to allow a researcher to control the robot's script. As seen in Figure 1, the laptop displayed the robot's eyes. The robot's eyes and effects such as blinking were used to make the robot more lifelike. In the experiment the robot's eyes were used to look at objects. Based on the research conducted by Siegel et al. [13], the robot had a female voice. Both the human-like and machine-like voice were female (or female-like) voices. The used voices are discussed in section 3.2. The robot itself had no gender; the gender was determined by the voice of the robot. The robot introduced herself as Jane.

Figure 1: the Magabot with laptop and eyes used in the experiment.

3.1 Participants
A total of 21 participants took part in the experiment, using a between-subjects design. Their age varied between 19 and 31 years, with an average of 21.6 years. The second oldest participant was 24 years old. The experiment using the human-like voice had 10 participants and the experiment using the machine-like voice had 11. All but one of those participants were students of exact sciences such as Computer Science, Electrical Engineering, Mathematics or a master in the same area. The other student studied Climate & Management and was apparently lost. Because of the large number of students with a technical background, only 4 participants were female. They were equally divided over both conditions.

3.2 Voice
Two different female voices were used in the experiment. The robot used a predefined script (in English) to communicate with the participants. Both voices were created with the MARY Text-to-Speech system. The first group of participants (N=10) interacted with a robot using a synthesized human-like female voice. The second group of participants (N=11) interacted with a robot using a synthesized machine-like or mechanical female voice.
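The two voice conditions can be approximated with MARY's HTTP interface. The sketch below builds the request parameters for the 'cmu-slt-hsmm' voice named in section 3.2; the server URL, default port 59125, and the Robot audio effect parameters are assumptions based on MARY TTS's documented HTTP interface, not details taken from this paper.

```python
import urllib.parse
import urllib.request

# Assumed default address of a locally running MARY TTS server.
MARY_URL = "http://localhost:59125/process"

def build_mary_params(text, machine_like=False):
    """Build the query parameters for a MARY TTS synthesis request."""
    params = {
        "INPUT_TEXT": text,
        "INPUT_TYPE": "TEXT",
        "OUTPUT_TYPE": "AUDIO",
        "AUDIO": "WAVE",
        "LOCALE": "en_US",
        "VOICE": "cmu-slt-hsmm",  # the female HMM voice used in the study
    }
    if machine_like:
        # MARY's built-in Robot audio effect is one plausible way to obtain
        # the "robot filter" described for the machine-like condition.
        params["effect_Robot_selected"] = "on"
        params["effect_Robot_parameters"] = "amount:100.0"
    return params

def synthesize(text, machine_like, out_path):
    """Request a WAV file from a running MARY server (untested sketch)."""
    query = urllib.parse.urlencode(build_mary_params(text, machine_like))
    with urllib.request.urlopen(MARY_URL + "?" + query) as response:
        with open(out_path, "wb") as f:
            f.write(response.read())
```

With `machine_like=False` the request yields the plain human-like voice; with `machine_like=True` the same voice is passed through the robot effect, so only the filter differs between conditions.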
The voice used in MARY was called 'cmu-slt-hsmm'. For the second group, a robot filter was applied to create a more machine-like or mechanical voice.

3.3 Experiment Setup
The conducted experiment was combined with research about proxemics in human-robot interaction, or more specifically the relationship between a robot's voice and proxemics. The experiment was divided into three parts. In the first part the robot tells its name and asks the participant to come closer. This part was needed for the other experiment and also serves as a good introduction of the robot. The participant walks towards the robot, and the robot then asks the participant to do some simple tasks. The participant is asked to take a seat at the table. The robot drives to the other side of the table, making sure the participant is sitting opposite the robot.

For the second part of the experiment, the participant performs six simple assignments. On the table are six cards with letters forming the word "thanks" and also six (empty) numbered boxes. Three of the cards are faced up and show the letter; the others show the backside, which is colored (red, green and blue). The six assignments are small assignments such as "move the letter N to box 2", "swap the letter N with the letter H" or "please turn over all colored squares". The start of each assignment was triggered by hand using a Wizard of Oz technique. After some assignments the participant tells the robot how many boxes are left empty, to enforce interaction with the robot. After moving all cards the robot asks the participant to flip all colored cards and read the word in the boxes. The robot then thanks the participant and drives to the table on the left-hand side of the participant.

The third part of the experiment is very short. When the robot arrives at the table it turns around and asks the participant to come closer. This part is also needed for the other experiment. The robot tells the participant there is a questionnaire on the table next to her and asks the participant to fill it in. With that, the experiment has ended. Figure 2 shows a picture of the experiment setup. The table with the six squares is in front of a platform on which the robot drives around. This was necessary because otherwise the robot would have been too small and the participant would not have been able to see the robot's eyes. The moving assignments are similar to the ones used by Staudte et al. [14]. However, this experiment is not based on utterance or gesture, but instead focuses on the two different voices.

Figure 2: the experiment setup.
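The Wizard of Oz control described above, where a hidden operator advances the robot's predefined script segment by segment, can be sketched as a simple state machine. The segment labels and utterances below are illustrative, not the study's actual Flash timeline.

```python
# Hypothetical sketch of the Wizard of Oz control: the robot's script is a
# fixed sequence of segments and the operator advances it by hand whenever
# the participant finishes an assignment.

SCRIPT = [
    ("intro",        "Hello, I am Jane. Please come closer."),
    ("assignment_1", "Please move the letter N to box 2."),
    ("assignment_2", "Please swap the letter N with the letter H."),
    ("assignment_3", "Please turn over all colored squares."),
    ("thanks",       "Thank you! Please fill in the questionnaire."),
]

class WizardController:
    """Steps through predefined script segments on operator input."""

    def __init__(self, script):
        self.script = script
        self.index = -1  # nothing has been played yet

    def advance(self):
        """Operator trigger: return the next segment, or None when done."""
        self.index += 1
        if self.index >= len(self.script):
            return None
        return self.script[self.index]

controller = WizardController(SCRIPT)
print(controller.advance())  # ('intro', 'Hello, I am Jane. Please come closer.')
```

Each `advance()` call corresponds to the researcher triggering the next segment of the Flash timeline by hand.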
3.4 Questionnaire & video footage
A three-page questionnaire was used to determine if a person was engaged or interested in the robot or task. The questionnaire is based on the measurement instruments for anthropomorphism, animacy, likeability, perceived intelligence and perceived safety of robots provided by Bartneck et al. [2], to get an understanding of how the robot was perceived by the participants. The questionnaire also contains questions based on the attention allocation scale provided by Harms and Biocca [6] to find out whether the participants were interested in the robot or the task.

The questionnaire consists of five parts. The first part asks for some general information. The second, third and fifth parts contain 32 questions about the experiment on a 5-point Likert scale provided by Bartneck et al. [2]. The fourth part contains 9 questions which are rated on a scale from 1 to 7, some of which are provided by Harms and Biocca [6]. Because the conducted experiment was shared with Rens Hoegen, not all questions will be used for providing answers in this paper.

Most important questions
The categories rating the participant's impression of the robot contain some important questions for answering the research questions:
- artificial / lifelike & mechanical / organic: questions in the categories anthropomorphism and animacy to determine how the robot's appearance was perceived.
- unfriendly / friendly & unpleasant / pleasant: questions in the category likeability to rate the impression of the robot.
- incompetent / competent & unintelligent / intelligent: questions in the category perceived intelligence.

Another category contains statements with which the participant can disagree or agree. The most important are:
- I was interested in the robot.
- I was interested in the task.
- I remained focused on the robot throughout our interaction.
- I remained focused on the task throughout our interaction.

The questionnaire ends with a manipulation check to determine if the difference in voices was noticed.
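A per-category score is obtained by averaging the 5-point semantic differential items belonging to that category. A minimal sketch of this scoring; the item-number assignment below is illustrative (the actual assignment is given in section 4.1 and Appendix A):

```python
# Illustrative mapping from Bartneck et al.'s categories to questionnaire
# item numbers; the real mapping appears in Table 1 of the paper.
CATEGORIES = {
    "anthropomorphism":       [1, 5, 13],
    "likeability":            [3, 7, 11, 15, 19],
    "perceived_intelligence": [4, 8, 12, 16],
}

def category_scores(responses):
    """responses: dict mapping question number -> rating (1..5).
    Returns the mean rating per category."""
    return {
        name: sum(responses[q] for q in items) / len(items)
        for name, items in CATEGORIES.items()
    }

# Made-up ratings from one participant:
example = {1: 2, 5: 3, 13: 4, 3: 5, 7: 4, 11: 5, 15: 4, 19: 5,
           4: 3, 8: 4, 12: 3, 16: 4}
print(category_scores(example)["likeability"])  # 4.6
```

The category means computed this way are what the group comparisons in section 4.1 operate on.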
The experiment was also recorded by two different cameras (one full-HD camcorder and one fisheye camera) in order to support the questionnaire. Some small screenshots taken from the video footage can be found in section 4.2. The questions of the questionnaire can be found in Appendix A.

4. RESULTS
Most of the 41 questions from the questionnaire were used to analyze the relationship between voices and the level of interest. Some questions were not relevant for this research because it was a combined experiment. First it is important to determine the reliability of the questions; after that the questions will be used to answer the research questions. Finally some screenshots of the video footage are shown to highlight some details.

Figure 3: a schematic representation of the experiment.
4.1 Questionnaire
The categories anthropomorphism, animacy, likeability, perceived intelligence and perceived safety suggested by Bartneck et al. [2] were all used in the questionnaire.

Reliability
The Cronbach's Alpha value of each category is listed in the third column of Table 1. The second column shows which questions were used in that particular category. The matching questions can be found in Appendix A.

Table 1: Cronbach's Alpha values for each category before removing questions.

Category            Questions (#)       Cronbach's Alpha
Anthropomorphism    1, 5, 9, 13,
Animacy             2, 6, 10, 13, 14,
Likeability         3, 7, 11, 15,
Perc. intelligence  4, 8, 12, 16,
Perc. safety        24, 25,

Because most Cronbach's Alpha values were not sufficiently reliable, some questions have been deleted. To achieve a Cronbach's Alpha value of at least 0.70 in each category, the following questions were deleted:
- moving rigidly / moving elegantly & unconscious / conscious in the category anthropomorphism.
- apathetic / responsive, stagnant / lively & inert / interactive in the category animacy.
- foolish / sensible in the category perceived intelligence.
- quiescent / surprised in the category perceived safety.

It is important to state that some participants had trouble with the meaning of "apathetic" and "quiescent"; all participants were Dutch. This results in the following Cronbach's Alpha values:

Table 2: Cronbach's Alpha values for each category after removing questions.

Category            Questions (#)       Cronbach's Alpha
Anthropomorphism    1, 5,
Animacy             2, 10,
Likeability         3, 7, 11, 15,
Perc. intelligence  4, 8, 12,
Perc. safety        24,

Analysis
In the tables below the results of the Independent Samples t-tests are shown. Table 3 shows the results for Bartneck's categories. The results of the questions used in the manipulation check are shown in Table 4. The other questions, including the questions taken from Harms and Biocca, are evaluated separately in Tables 5 and 6.
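The two statistics used in this section are Cronbach's alpha for the category reliability and the independent-samples t-test with pooled variance, which with group sizes 10 and 11 gives the 19 degrees of freedom reported as t(19). A self-contained sketch with illustrative data (the study's raw ratings are not reproduced here):

```python
import math

def variance(xs):
    """Sample variance (ddof = 1)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of participant ratings per questionnaire item."""
    k = len(items)
    totals = [sum(row) for row in zip(*items)]  # per-participant sum score
    return (k / (k - 1)) * (1 - sum(variance(col) for col in items)
                            / variance(totals))

def ttest_ind(a, b):
    """Student's independent-samples t-test with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(pooled * (1 / na + 1 / nb))
    return t, na + nb - 2  # statistic and degrees of freedom

# Perfectly consistent items give alpha = 1; dropping an item and
# recomputing mirrors the deletions used to reach alpha >= 0.70:
perfect = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
alpha = cronbach_alpha(perfect)

# Illustrative ratings for the two conditions (N=10 and N=11):
human = [5, 6, 5, 7, 6, 5, 6, 7, 5, 6]
machine = [4, 5, 5, 4, 6, 5, 4, 5, 6, 4, 5]
t, df = ttest_ind(human, machine)
# With df = 19, |t| > 2.093 marks significance at the two-tailed .05 level.
```

Negatively worded 7-point items can be reverse-coded as 8 minus the rating before testing, which matches the "(Recoded)" questions in Tables 5 and 6.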
Table 3: Independent Samples t-test results for Bartneck's categories with both conditions.

Category                Voice           M    SD   t(19)   p
Anthropomorphism        Human-like
                        Machine-like
Animacy                 Human-like
                        Machine-like
Likeability             Human-like
                        Machine-like
Perceived intelligence  Human-like
                        Machine-like
Perceived safety        Human-like
                        Machine-like

Analyzing and comparing the two conditions shows that the categories animacy, perceived intelligence and perceived safety show no significant difference between the human-like voice and the machine-like voice. The category likeability, on the contrary, does show a significant difference between the two voices. The category anthropomorphism was approaching significance.

Table 4: Independent Samples t-test results for the manipulation check with both conditions.

Question                 Voice           M    SD   t(19)   p
Machinelike / Humanlike  Human-like
                         Machine-like
Unpleasant / Pleasant    Human-like
                         Machine-like
Disengaging / Engaging   Human-like
                         Machine-like
Unclear / Clear          Human-like
                         Machine-like

The questions machinelike / humanlike, unpleasant / pleasant and disengaging / engaging in the manipulation check about the voice only showed minor differences, mostly in favor of the human-like voice. The most important significant difference can be found in the question unclear / clear: the human-like voice was rated clearer than the machine-like voice, so the difference in voices was noticed. This can also be seen in the video footage, discussed in section 4.2 below. Some 7-point scale questions are excluded from this research; the others are listed in Tables 5 and 6. For each of these questions an Independent Samples t-test was performed.

Table 5: Independent Samples t-test results for the remaining questions with the human-like voice.

Question                                          M    SD   t(19)   p
I feel that the robot is interesting to look at.
I was interested in the robot.
I was interested in the task.
I was easily distracted from the robot when other things were going on. (Recoded)
I remained focused on the robot throughout our interaction.
I remained focused on the task throughout our interaction.
Understanding the robot was difficult. (Recoded *)

Table 6: Independent Samples t-test results for the remaining questions with the machine-like voice.

Question                                          M    SD   t(19)   p
I feel that the robot is interesting to look at.
I was interested in the robot.
I was interested in the task.
I was easily distracted from the robot when other things were going on. (Recoded)
I remained focused on the robot throughout our interaction.
I remained focused on the task throughout our interaction.
Understanding the robot was difficult. (Recoded *)

The above tables show that the differences between the two voices in terms of interest are not significant. Though it is interesting to note that there is a slight difference in focusing on either the task or the robot.

* Note that some questions are recoded to make sure that all questions are rated the same way, meaning that 1 = negative, 4 = neutral and 7 = positive. In the question "Understanding the robot was difficult." 1 was positive (not difficult) and 7 was negative (difficult); therefore the question is recoded to match its rating with the other questions.

4.2 Video footage
In the previous section it became clear that the human-like voice was more likeable and that understanding the robot was slightly more difficult with the machine-like voice. The machine-like voice was rated as more unclear compared to the human-like voice. The video footage showed the same results. The pictures below show some information about the two voices. These pictures will be used in section 6 to support the conclusion.

Figure 3: the participant is fixating on the robot after completing an assignment.

Figure 4: her facial expressions indicate that she has difficulties with understanding the robot.

Figure 5: the participant is staring at the background, thinking about what the robot just said.

5. DISCUSSION
The difference between the two used voices turned out to be mostly not significant. The most important differences were found in Bartneck's category likeability and the question unclear / clear in the manipulation check. The human-like voice scored higher on the likeability scale and was found clearer than the machine-like voice. The questions in the category anthropomorphism were only approaching significance and showed that the human-like voice was perceived as slightly more anthropomorphic. There was also a minor difference in understanding the robot: the machine-like voice was harder to understand than the human-like voice, which relates to the difference found in the unclear / clear question.
The relationship between voice and interest is not significant. In the introduction the question 'If a robot is perceived as artificial and uses a mechanical voice instead of the usual human-like voice, does this influence the way humans are engaged to perform a certain task?' was raised. The results show that the influence of the voices is minimal; the robot with the human-like voice was perceived as only a little more interesting. The results in Tables 5 and 6 appear to show a contradiction: the human-like voice scored slightly better on the questions about interest, while the questions about focus show a minor difference in favor of the machine-like voice.

6. CONCLUSION
As mentioned above, some results appear to be contradictory. Though these results are not as significant as hoped, the difference can still be explained. Because the machine-like voice was rated less clear than the human-like voice, some participants had difficulties with understanding the robot. Section 4.2 shows three pictures taken from the video footage. Figure 4 shows a participant with facial expressions indicating it takes some effort to understand the robot. Figure 5 shows a participant thinking about what the robot said. It seems that the participants with the machine-like voice focused more on the robot because of the unclear voice.

With the above results, the research questions mentioned in section 1.2 can be answered: this research showed that a robot's voice does not significantly influence the way participants are engaged to perform a certain task, nor were the participants significantly more interested in the robot or the task. The human-like voice in general scored better than the machine-like voice. This result is similar to the results found by other researchers mentioned in the introduction.

7. FUTURE WORK
In order to fully understand the relationship between voice and engagement, more research needs to be conducted.
To improve this research, better equipment is necessary. The used Magabot did not have a body (or body movement) and lacked facial expressions; the robot was not very socially interactive. To improve this experiment, aspects of the robot such as embodiment and perceived gender (more than just the voice of the robot) should be implemented.

The video footage could have been of greater importance to this research. Due to time constraints, it was only possible to use the video footage as support for the questionnaire. The analysis of the video footage might reveal more interesting facts in a larger study.

Some participants asked how the robot knew they were finished with a certain task. Parts of the experiment were conducted using the Wizard of Oz technique, which might have influenced the way the participants perceived the robot. In the future it is necessary to create a robot which responds autonomously based on the actions of the participant.

Future work on this research might use a larger group of participants with equally mixed gender and a less technical background. The experiment was also conducted in a task-based environment; experimenting in a more real-life situation would be more appropriate.

8. ACKNOWLEDGEMENTS
I would like to thank Betsy van Dijk and Manja Lohse for providing guidance and solving problems during the research. I would also like to thank the University of Twente for providing all necessary facilities to conduct the experiments. My thanks also go to Gilberto Sepúlveda Bradford and Rens Hoegen for helping with the robot and the experiments.

9. REFERENCES
[1] C. Bartneck and J. Forlizzi. A Design-Centred Framework for Social Human-Robot Interaction. In Proceedings of Ro-Man 2004, Kurashiki. ACM.
[2] C. Bartneck, D. Kulić, E. Croft and S. Zoghbi. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots.
In International Journal of Social Robotics, Vol 1(1).
[3] C. R. Crowell, M. Villano, M. Scheutz and P. Schermerhorn. Gendered voice and robot entities: Perceptions and reactions of male and female subjects. In Intelligent Robots and Systems, IROS 2009, IEEE/RSJ International Conference.
[4] J. Goetz, S. Kiesler, and A. Powers. Matching robot appearance and behavior to tasks to improve human-robot cooperation. In Proceedings of ROMAN 2003, the 12th IEEE International Workshop on Robot and Human Interactive Communication, 55-60. ACM.
[5] L. Gong and C. Nass. When a talking-face computer agent is half-human and half-humanoid: human identity and consistency preference. In Journal of Human Communication Research, Vol 33(2).
[6] C. Harms and F. A. Biocca. Internal consistency and reliability of the networked minds social presence measure. In M. Alcaniz & B. Rey (Eds.), Seventh Annual International Workshop. ACM.
[7] K. M. Lee, W. Peng, S. Jin and C. Yan. Can robots manifest personality? An empirical test of personality recognition, social responses, and social presence in human-robot interaction. In Journal of Communication, 56.
[8] J. A. LePine and L. Van Dyne. Voice and cooperative behavior as contrasting forms of contextual performance: Evidence of differential relationships with Big Five personality characteristics and cognitive ability. In Journal of Applied Psychology, Vol 86(2).
[9] K. MacDorman and H. Ishiguro. The uncanny advantage of using androids in cognitive and social science research. Interaction Studies, Vol 7(3).
[10] B. Mutlu, J. Forlizzi and J. Hodgins. A Storytelling Robot: Modeling and Evaluation of Human-like Gaze Behavior. In Proceedings of HUMANOIDS 06. ACM.
[11] Y. Okuno, T. Kanda, M. Imai, H. Ishiguro, and N. Hagita. Providing route directions: design of robot's utterance, gesture, and timing. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI '09, 53-60. ACM.
[12] C. Peters, G. Castellano and S. de Freitas. An exploration of user engagement in HCI. In Proceedings of the Affective-Aware Virtual Agents and Social Robots (AFFINE) Workshop, International Conference on Multimodal Interfaces, ICMI '09. ACM.
[13] M. Siegel, C. Breazeal and M. I. Norton. Persuasive Robotics: The influence of robot gender on human behavior. In Intelligent Robots and Systems, IROS 2009. ACM.
[14] M. Staudte and M. Crocker. Visual attention in spoken human-robot interaction. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI '09, 77-84. ACM.
[15] A. Steinfeld, T. Fong, D. Kaber, M. Lewis, J. Scholtz, A. Schultz and M. Goodrich. Common metrics for human-robot interaction. In Proceedings of the 1st ACM/IEEE International Conference on Human-Robot Interaction, HRI '06. ACM.
[16] M. L. Walters, K. Dautenhahn, R. te Boekhorst, K. L. Koay, C. Kaouri, S. Woods, C. Nehaniv, D. Lee, and I. Werry. The influence of subjects' personality traits on personal spatial zones in a human-robot interaction experiment. In Proceedings of COG SCI 2005: Toward Social Mechanisms of Android Science Workshop, 29-37. ACM.
[17] M. L. Walters, D. S. Syrdal, K. Dautenhahn, R. te Boekhorst and K. L. Koay. Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Autonomous Robots.
[18] M. L. Walters, D. S. Syrdal, K. L. Koay, K. Dautenhahn, R. te Boekhorst.
Human approach distances to a mechanical-looking robot with different robot voice styles. In the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2008.
APPENDIX A. RESULTS QUESTIONNAIRE
Table A1: first part of the questionnaire, containing questions rated on a scale from 1 to 5 (mean and standard deviation per condition: human-like voice, machine-like voice).

Rate the impression of the robot on a scale from 1 to 5.
1 Fake / Natural
2 Dead / Alive
3 Dislike / Like
4 Incompetent / Competent
5 Machinelike / Humanlike
6 Stagnant / Lively
7 Unfriendly / Friendly
8 Ignorant / Knowledgeable
9 Unconscious / Conscious
10 Mechanical / Organic
11 Unkind / Kind
12 Irresponsible / Responsible
13 Artificial / Lifelike
14 Inert / Interactive
15 Unpleasant / Pleasant
16 Unintelligent / Intelligent
17 Moving rigidly / Moving elegantly
18 Apathetic / Responsive
19 Awful / Nice
20 Foolish / Sensible
21 Quiet / Loud
22 Unhelpful / Helpful
23 Intimidating / Inviting

Rate your emotional state on a scale from 1 to 5.
24 Anxious / Relaxed
25 Agitated / Calm
26 Quiescent / Surprised
27 Unsafe / Safe
28 Pressured / At ease
Table A2: second part of the questionnaire, containing questions rated on a scale from 1 to 7 (mean and standard deviation per condition: human-like voice, machine-like voice).

Give your opinion on the following statements. (Scale from 1 to 7, strongly disagree to strongly agree.)
29 I feel that the robot is interesting to look at.
30 I was interested in the robot.
31 I was interested in the task.
32 I was easily distracted from the robot when other things were going on.
33 I remained focused on the robot throughout our interaction.
34 I remained focused on the task throughout our interaction.
35 Understanding the robot was difficult.
36 Throughout our interaction I became more familiar with the robot.
37 I felt uncomfortable when I was close to the robot.

Table A3: third part of the questionnaire, containing questions rated on a scale from 1 to 5, used as the manipulation check (mean and standard deviation per condition: human-like voice, machine-like voice).

Rate your impression of the voice of the robot on the following scales.
38 Machinelike / Humanlike
39 Unpleasant / Pleasant
40 Disengaging / Engaging
41 Unclear / Clear
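The mean and standard deviation columns of Tables A1 to A3 can be produced from the raw ratings with a small helper, one group per (condition, question) pair. The data below is illustrative, not the study's data.

```python
import math
from collections import defaultdict

def summarize(ratings):
    """ratings: list of (condition, question_number, rating) tuples.
    Returns {(condition, question_number): (mean, sample_sd)}."""
    groups = defaultdict(list)
    for condition, qnum, rating in ratings:
        groups[(condition, qnum)].append(rating)
    summary = {}
    for key, xs in groups.items():
        m = sum(xs) / len(xs)
        sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
        summary[key] = (m, sd)
    return summary

# Made-up ratings for question 30 ("I was interested in the robot"):
demo = [("human", 30, 5), ("human", 30, 7),
        ("machine", 30, 4), ("machine", 30, 6)]
print(summarize(demo)[("human", 30)])  # mean 6.0, sd ~1.41
```

Running this over all 41 questions for both conditions yields exactly the per-condition cells of the appendix tables.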
BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of
More informationAndroid as a Telecommunication Medium with a Human-like Presence
Android as a Telecommunication Medium with a Human-like Presence Daisuke Sakamoto 1&2, Takayuki Kanda 1, Tetsuo Ono 1&2, Hiroshi Ishiguro 1&3, Norihiro Hagita 1 1 ATR Intelligent Robotics Laboratories
More informationComparing Human Robot Interaction Scenarios Using Live and Video Based Methods: Towards a Novel Methodological Approach
Comparing Human Robot Interaction Scenarios Using Live and Video Based Methods: Towards a Novel Methodological Approach Sarah Woods, Michael Walters, Kheng Lee Koay, Kerstin Dautenhahn Adaptive Systems
More informationEffects of Gesture on the Perception of Psychological Anthropomorphism: A Case Study with a Humanoid Robot
Effects of Gesture on the Perception of Psychological Anthropomorphism: A Case Study with a Humanoid Robot Maha Salem 1, Friederike Eyssel 2, Katharina Rohlfing 2, Stefan Kopp 2, and Frank Joublin 3 1
More informationEssay on A Survey of Socially Interactive Robots Authors: Terrence Fong, Illah Nourbakhsh, Kerstin Dautenhahn Summarized by: Mehwish Alam
1 Introduction Essay on A Survey of Socially Interactive Robots Authors: Terrence Fong, Illah Nourbakhsh, Kerstin Dautenhahn Summarized by: Mehwish Alam 1.1 Social Robots: Definition: Social robots are
More informationSocial Robots Research Reports Project website: Institute website:
Orelena Hawks Puckett Institute Social Robots Research Reports, 2013, Number 2, 1-5. Social Robots Research Reports Project website: www.socialrobots.org Institute website: www.puckett.org Social Robots
More informationWedding Robotics: A case study
Wedding Robotics: A case study Pedro Vicente, Alexandre Bernardino Institute for Systems and Robotics (ISR/IST), LARSyS, Instituto Superior Técnico, Universidade de Lisboa, Lisboa, Portugal Email: {pvicente,alex}@isr.ist.utl.pt
More informationRole inconsistencies in elderly care robots
Role inconsistencies in elderly care robots You are doing your health exercises well, but I think I will win Veron Wormeester - 0758754 Date: 18-03-2014 Supervisors Raymond Cuijpers Wijnand IJsselsteijn
More informationDoes the Appearance of a Robot Affect Users Ways of Giving Commands and Feedback?
19th IEEE International Symposium on Robot and Human Interactive Communication Principe di Piemonte - Viareggio, Italy, Sept. 12-15, 2010 Does the Appearance of a Robot Affect Users Ways of Giving Commands
More informationClose Encounters: Spatial Distances between People and a Robot of Mechanistic Appearance *
Close Encounters: Spatial Distances between People and a Robot of Mechanistic Appearance * Michael L Walters, Kerstin Dautenhahn, Kheng Lee Koay, Christina Kaouri, René te Boekhorst, Chrystopher Nehaniv,
More informationAnalysis of humanoid appearances in human-robot interaction
Analysis of humanoid appearances in human-robot interaction Takayuki Kanda, Takahiro Miyashita, Taku Osada 2, Yuji Haikawa 2, Hiroshi Ishiguro &3 ATR Intelligent Robotics and Communication Labs. 2 Honda
More informationWho like androids more: Japanese or US Americans?
Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, Technische Universität München, Munich, Germany, August 1-3, 2008 Who like androids more: Japanese or
More informationThis is a repository copy of Don t Worry, We ll Get There: Developing Robot Personalities to Maintain User Interaction After Robot Error.
This is a repository copy of Don t Worry, We ll Get There: Developing Robot Personalities to Maintain User Interaction After Robot Error. White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/102876/
More informationDetermining appropriate first contact distance: trade-offs in human-robot interaction experiment design
Determining appropriate first contact distance: trade-offs in human-robot interaction experiment design Aaron G. Cass, Eric Rose, Kristina Striegnitz and Nick Webb 1 Abstract Robots are increasingly working
More informationFacilitating Employee Intention to Work with Robots
Facilitating Employee Intention to Work with Robots Research Idea Abstract Sangseok You Syracuse University syou03@syr.edu Lionel P. Robert Jr. University of Michigan lprobert@umich.edu Abstract Organizations
More informationTouch Perception and Emotional Appraisal for a Virtual Agent
Touch Perception and Emotional Appraisal for a Virtual Agent Nhung Nguyen, Ipke Wachsmuth, Stefan Kopp Faculty of Technology University of Bielefeld 33594 Bielefeld Germany {nnguyen, ipke, skopp}@techfak.uni-bielefeld.de
More informationProactive Behavior of an Autonomous Mobile Robot for Human-Assisted Learning
Proactive Behavior of an Autonomous Mobile Robot for Human-Assisted Learning A. Garrell, M. Villamizar, F. Moreno-Noguer and A. Sanfeliu Institut de Robo tica i Informa tica Industrial, CSIC-UPC {agarrell,mvillami,fmoreno,sanfeliu}@iri.upc.edu
More informationTowards an Imperfect Robot for Long-term Companionship: Case Studies Using Cognitive Biases
1 Towards an Imperfect Robot for Long-term Companionship: Case Studies Using Cognitive Biases Biswas, M. and Murray, J.C. Abstract The research presented in this paper aims to find out what affect cognitive
More informationAnthropomorphism and Human Likeness in the Design of Robots and Human-Robot Interaction
Anthropomorphism and Human Likeness in the Design of Robots and Human-Robot Interaction Julia Fink CRAFT, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland julia.fink@epfl.ch Abstract.
More informationThe Good, The Bad, The Weird: Audience Evaluation of a Real Robot in Relation to Science Fiction and Mass Media
The Good, The Bad, The Weird: Audience Evaluation of a Real Robot in Relation to Science Fiction and Mass Media Ulrike Bruckenberger, Astrid Weiss, Nicole Mirnig, Ewald Strasser, Susanne Stadler, and Manfred
More informationSocial Robots and Human-Robot Interaction Ana Paiva Lecture 12. Experimental Design for HRI
Social Robots and Human-Robot Interaction Ana Paiva Lecture 12. Experimental Design for HRI Scenarios we are interested.. Build Social Intelligence d) e) f) Focus on the Interaction Scenarios we are interested..
More informationEmpirical Results from Using a Comfort Level Device in Human-Robot Interaction Studies
Empirical Results from Using a Comfort Level Device in Human-Robot Interaction Studies K.L. Koay, K. Dautenhahn, S.N. Woods and M.L. Walters University of Hertfordshire School of Computer Science College
More informationAll Robots Are Not Created Equal: The Design and Perception of Humanoid Robot Heads
All Robots Are Not Created Equal: The Design and Perception of Humanoid Robot Heads Carl F. DiSalvo, Francine Gemperle, Jodi Forlizzi, Sara Kiesler Human Computer Interaction Institute and School of Design,
More informationExploratory Study of a Robot Approaching a Person
Exploratory Study of a Robot Approaching a Person in the Context of Handing Over an Object K.L. Koay*, E.A. Sisbot+, D.S. Syrdal*, M.L. Walters*, K. Dautenhahn* and R. Alami+ *Adaptive Systems Research
More informationDrumming with a Humanoid Robot: Lessons Learnt from Designing and Analysing Human-Robot Interaction Studies
Drumming with a Humanoid Robot: Lessons Learnt from Designing and Analysing Human-Robot Interaction Studies Hatice Kose-Bagci, Kerstin Dautenhahn, and Chrystopher L. Nehaniv Adaptive Systems Research Group
More informationRobots Have Needs Too: People Adapt Their Proxemic Preferences to Improve Autonomous Robot Recognition of Human Social Signals
Robots Have Needs Too: People Adapt Their Proxemic Preferences to Improve Autonomous Robot Recognition of Human Social Signals Ross Mead 1 and Maja J Matarić 2 Abstract. An objective of autonomous socially
More informationMachine Trait Scales for Evaluating Mechanistic Mental Models. of Robots and Computer-Based Machines. Sara Kiesler and Jennifer Goetz, HCII,CMU
Machine Trait Scales for Evaluating Mechanistic Mental Models of Robots and Computer-Based Machines Sara Kiesler and Jennifer Goetz, HCII,CMU April 18, 2002 In previous work, we and others have used the
More informationSocial Acceptance of Humanoid Robots
Social Acceptance of Humanoid Robots Tatsuya Nomura Department of Media Informatics, Ryukoku University, Japan nomura@rins.ryukoku.ac.jp 2012/11/29 1 Contents Acceptance of Humanoid Robots Technology Acceptance
More informationAll Robots Are Not Created Equal: The Design and Perception of Humanoid Robot Heads
All Robots Are Not Created Equal: The Design and Perception of Humanoid Robot Heads This paper presents design research conducted as part of a larger project on human-robot interaction. The primary goal
More informationEffects of Nonverbal Communication on Efficiency and Robustness in Human-Robot Teamwork
Effects of Nonverbal Communication on Efficiency and Robustness in Human-Robot Teamwork Cynthia Breazeal, Cory D. Kidd, Andrea Lockerd Thomaz, Guy Hoffman, Matt Berlin MIT Media Lab 20 Ames St. E15-449,
More informationMIN-Fakultät Fachbereich Informatik. Universität Hamburg. Socially interactive robots. Christine Upadek. 29 November Christine Upadek 1
Christine Upadek 29 November 2010 Christine Upadek 1 Outline Emotions Kismet - a sociable robot Outlook Christine Upadek 2 Denition Social robots are embodied agents that are part of a heterogeneous group:
More informationEffect of Cognitive Biases on Human-Robot Interaction: A Case Study of Robot's Misattribution
Effect of Cognitive Biases on Human-Robot Interaction: A Case Study of Robot's Misattribution Biswas, M. and Murray, J. Abstract This paper presents a model for developing longterm human-robot interactions
More informationVoice Activation Control with Digital Assistant for Humanoid Robot Torso
Voice Activation Control with Digital Assistant for Humanoid Robot Torso Conor Wallace conorw8@gmail.com Department of Electrical and Computer Engineering University of Texas at San Antonio Abstract Digital
More informationAdaptive Human aware Navigation based on Motion Pattern Analysis Hansen, Søren Tranberg; Svenstrup, Mikael; Andersen, Hans Jørgen; Bak, Thomas
Aalborg Universitet Adaptive Human aware Navigation based on Motion Pattern Analysis Hansen, Søren Tranberg; Svenstrup, Mikael; Andersen, Hans Jørgen; Bak, Thomas Published in: The 18th IEEE International
More informationTrust, Satisfaction and Frustration Measurements During Human-Robot Interaction Moaed A. Abd, Iker Gonzalez, Mehrdad Nojoumian, and Erik D.
Trust, Satisfaction and Frustration Measurements During Human-Robot Interaction Moaed A. Abd, Iker Gonzalez, Mehrdad Nojoumian, and Erik D. Engeberg Department of Ocean &Mechanical Engineering and Department
More informationINTERACTIONS WITH ROBOTS:
INTERACTIONS WITH ROBOTS: THE TRUTH WE REVEAL ABOUT OURSELVES Annual Review of Psychology Vol. 68:627-652 (Volume publication date January 2017) First published online as a Review in Advance on September
More informationEmpathy Objects: Robotic Devices as Conversation Companions
Empathy Objects: Robotic Devices as Conversation Companions Oren Zuckerman Media Innovation Lab School of Communication IDC Herzliya P.O.Box 167, Herzliya 46150 ISRAEL orenz@idc.ac.il Guy Hoffman Media
More informationUnderstanding Anthropomorphism: Anthropomorphism is not a Reverse Process of Dehumanization
Understanding Anthropomorphism: Anthropomorphism is not a Reverse Process of Dehumanization Jakub Z lotowski 1,2(B), Hidenobu Sumioka 2, Christoph Bartneck 1, Shuichi Nishio 2, and Hiroshi Ishiguro 2,3
More informationEvaluating Fluency in Human-Robot Collaboration
Evaluating Fluency in Human-Robot Collaboration Guy Hoffman Media Innovation Lab, IDC Herzliya P.O. Box 167, Herzliya 46150, Israel Email: hoffman@idc.ac.il Abstract Collaborative fluency is the coordinated
More informationPersuasive Robotics: the influence of robot gender on human behavior
Persuasive Robotics: the influence of robot gender on human behavior The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published
More informationA Long-Term Human-Robot Proxemic Study
A Long-Term Human-Robot Proxemic Study Michael L. Walters, Mohammedreza A. Oskoei, Dag Sverre Syrdal and Kerstin Dautenhahn, Member, IEEE Abstract A long-term Human-Robot Proxemic (HRP) study was performed
More informationBridging the gap between users' expectations and system evaluations
Bridging the gap between users' expectations and system evaluations Manja Lohse Abstract What users expect of a robot strongly influences their ratings of the interaction. If the robot satisfies the expectations,
More informationNatural Interaction with Social Robots
Workshop: Natural Interaction with Social Robots Part of the Topig Group with the same name. http://homepages.stca.herts.ac.uk/~comqkd/tg-naturalinteractionwithsocialrobots.html organized by Kerstin Dautenhahn,
More informationThe USUS Evaluation Framework for Human-Robot Interaction
The USUS Evaluation Framework for Human-Robot Interaction Astrid Weiss 1, Regina Bernhaupt 2, Michael Lankes 1 and Manfred Tscheligi 1 Abstract. To improve the way humans are interacting with robots various
More informationIntroduction to This Special Issue on Human Robot Interaction
HUMAN-COMPUTER INTERACTION, 2004, Volume 19, pp. 1 8 Copyright 2004, Lawrence Erlbaum Associates, Inc. Introduction to This Special Issue on Human Robot Interaction Sara Kiesler Carnegie Mellon University
More informationPreliminary Investigation of Moral Expansiveness for Robots*
Preliminary Investigation of Moral Expansiveness for Robots* Tatsuya Nomura, Member, IEEE, Kazuki Otsubo, and Takayuki Kanda, Member, IEEE Abstract To clarify whether humans can extend moral care and consideration
More informationREBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL
World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced
More informationIntended Type of Contribution Keywords Relationship Between Most Closely Related Prior Papers Role of Authors (Optional) Suggested Reviewers
This robot stinks! Differences between perceived mistreatment Zachary Carlson zack@nevada.unr.edu Louise Lemmon louise@nevada.unr.edu MacCallister Higgins mac@nevada.unr.edu David Frank davidfrank@nevada.unr.edu
More informationThis is a repository copy of Designing robot personalities for human-robot symbiotic interaction in an educational context.
This is a repository copy of Designing robot personalities for human-robot symbiotic interaction in an educational context. White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/102874/
More informationEvaluation of Passing Distance for Social Robots
Evaluation of Passing Distance for Social Robots Elena Pacchierotti, Henrik I. Christensen and Patric Jensfelt Centre for Autonomous Systems Royal Institute of Technology SE-100 44 Stockholm, Sweden {elenapa,hic,patric}@nada.kth.se
More informationEngagement During Dialogues with Robots
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Engagement During Dialogues with Robots Sidner, C.L.; Lee, C. TR2005-016 March 2005 Abstract This paper reports on our research on developing
More informationEvaluating the Augmented Reality Human-Robot Collaboration System
Evaluating the Augmented Reality Human-Robot Collaboration System Scott A. Green *, J. Geoffrey Chase, XiaoQi Chen Department of Mechanical Engineering University of Canterbury, Christchurch, New Zealand
More informationThe Influence of Subjects Personality Traits on Predicting Comfortable Human- Robot Approach Distances
The Influence of Subjects Personality Traits on Predicting Comfortable Human- Robot Approach Distances Michael L Walters (M.L.Walters@herts.ac.uk) Kerstin Dautenhahn (K.Dautenhahn@herts.ac.uk) René te
More informationA comparison of a genetic algorithm and a depth first search algorithm applied to Japanese nonograms
A comparison of a genetic algorithm and a depth first search algorithm applied to Japanese nonograms Wouter Wiggers Faculty of EECMS, University of Twente w.a.wiggers@student.utwente.nl ABSTRACT In this
More informationBody Movement Analysis of Human-Robot Interaction
Body Movement Analysis of Human-Robot Interaction Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono ATR Intelligent Robotics & Communication Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun,
More informationNavigation in the Presence of Humans
Proceedings of 2005 5th IEEE-RAS International Conference on Humanoid Robots Navigation in the Presence of Humans E. A. Sisbot, R. Alami and T. Simeon Robotics and Artificial Intelligence Group LAAS/CNRS
More informationTHE HRI EXPERIMENT FRAMEWORK FOR DESIGNERS
THE HRI EXPERIMENT FRAMEWORK FOR DESIGNERS Kwangmyung Oh¹ and Myungsuk Kim¹ ¹Dept. of Industrial Design, N8, KAIST, Daejeon, Republic of Korea, urigella, mskim@kaist.ac.kr ABSTRACT: In the robot development,
More informationAn Integrated Expert User with End User in Technology Acceptance Model for Actual Evaluation
Computer and Information Science; Vol. 9, No. 1; 2016 ISSN 1913-8989 E-ISSN 1913-8997 Published by Canadian Center of Science and Education An Integrated Expert User with End User in Technology Acceptance
More informationSIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The
SIGVerse - A Simulation Platform for Human-Robot Interaction Jeffrey Too Chuan TAN and Tetsunari INAMURA National Institute of Informatics, Japan The 29 th Annual Conference of The Robotics Society of
More informationUsing Variability Modeling Principles to Capture Architectural Knowledge
Using Variability Modeling Principles to Capture Architectural Knowledge Marco Sinnema University of Groningen PO Box 800 9700 AV Groningen The Netherlands +31503637125 m.sinnema@rug.nl Jan Salvador van
More informationDevelopment of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics -
Development of an Interactive Humanoid Robot Robovie - An interdisciplinary research approach between cognitive science and robotics - Hiroshi Ishiguro 1,2, Tetsuo Ono 1, Michita Imai 1, Takayuki Kanda
More informationEarly Take-Over Preparation in Stereoscopic 3D
Adjunct Proceedings of the 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI 18), September 23 25, 2018, Toronto, Canada. Early Take-Over
More informationImplications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA
Implications on Humanoid Robots in Pedagogical Applications from Cross-Cultural Analysis between Japan, Korea, and the USA Tatsuya Nomura,, No Member, Takayuki Kanda, Member, IEEE, Tomohiro Suzuki, No
More informationSven Wachsmuth Bielefeld University
& CITEC Central Lab Facilities Performance Assessment and System Design in Human Robot Interaction Sven Wachsmuth Bielefeld University May, 2011 & CITEC Central Lab Facilities What are the Flops of cognitive
More informationRELATED WORK Gaze model Gaze behaviors in human-robot interaction have been broadly evaluated: turn-taking [6], joint attention [7], influences toward
Can a Social Robot Help Children s Understanding of Science in Classrooms? Tsuyoshi Komatsubara, Masahiro Shiomi, Takayuki Kanda, Hiroshi Ishiguro, Norihiro Hagita ATR Intelligent Robotics and Communication
More informationAttracting Human Attention Using Robotic Facial. Expressions and Gestures
Attracting Human Attention Using Robotic Facial Expressions and Gestures Venus Yu March 16, 2017 Abstract Robots will soon interact with humans in settings outside of a lab. Since it will be likely that
More informationNon Verbal Communication of Emotions in Social Robots
Non Verbal Communication of Emotions in Social Robots Aryel Beck Supervisor: Prof. Nadia Thalmann BeingThere Centre, Institute for Media Innovation, Nanyang Technological University, Singapore INTRODUCTION
More informationA receptionist robot for Brazilian people: study on interaction involving illiterates
Paladyn, J. Behav. Robot. 2017; 8:1 17 Research Article Open Access Gabriele Trovato*, Josue G. Ramos, Helio Azevedo, Artemis Moroni, Silvia Magossi, Reid Simmons, Hiroyuki Ishii, and Atsuo Takanishi A
More informationNavigation Styles in QuickTime VR Scenes
Navigation Styles in QuickTime VR Scenes Christoph Bartneck Department of Industrial Design Eindhoven University of Technology Den Dolech 2, 5600MB Eindhoven, The Netherlands christoph@bartneck.de Abstract.
More informationDag Sverre Syrdal, Kerstin Dautenhahn, Michael L. Walters and Kheng Lee Koay
Sharing Spaces with Robots in a Home Scenario Anthropomorphic Attributions and their Effect on Proxemic Expectations and Evaluations in a Live HRI Trial Dag Sverre Syrdal, Kerstin Dautenhahn, Michael L.
More informationUne Méthodologie pour Evaluer l Acceptabilité de la Collaboration Homme-Robot en utilisant la Réalité Virtuelle
Une Méthodologie pour Evaluer l Acceptabilité de la Collaboration Homme-Robot en utilisant la Réalité Virtuelle Vincent Weistroffer, Alexis Paljic, Lucile Callebert, Philippe Fuchs To cite this version:
More informationThe Role of Dialog in Human Robot Interaction
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com The Role of Dialog in Human Robot Interaction Candace L. Sidner, Christopher Lee and Neal Lesh TR2003-63 June 2003 Abstract This paper reports
More informationA Design Platform for Emotion-Aware User Interfaces
A Design Platform for Emotion-Aware User Interfaces Eunjung Lee, Gyu-Wan Kim Department of Computer Science Kyonggi University Suwon, South Korea 82-31-249-9671 {ejlee,kkw5240}@kyonggi.ac.kr Byung-Soo
More informationComparing a Social Robot and a Mobile Application for Movie Recommendation: A Pilot Study
Comparing a Social Robot and a Mobile Application for Movie Recommendation: A Pilot Study Francesco Cervone, Valentina Sica, Mariacarla Staffa, Anna Tamburro, Silvia Rossi Dipartimento di Ingegneria Elettrica
More informationMotor Interference and Behaviour Adaptation in Human-Humanoid Interactions. Qiming Shen. Doctor of Philosophy
Motor Interference and Behaviour Adaptation in Human-Humanoid Interactions Qiming Shen A thesis submitted in partial fulfilment of the requirements of the University of Hertfordshire for the degree of
More informationEvaluation of a Tricycle-style Teleoperational Interface for Children: a Comparative Experiment with a Video Game Controller
2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication. September 9-13, 2012. Paris, France. Evaluation of a Tricycle-style Teleoperational Interface for Children:
More informationUnderstanding User Privacy in Internet of Things Environments IEEE WORLD FORUM ON INTERNET OF THINGS / 30
Understanding User Privacy in Internet of Things Environments HOSUB LEE AND ALFRED KOBSA DONALD BREN SCHOOL OF INFORMATION AND COMPUTER SCIENCES UNIVERSITY OF CALIFORNIA, IRVINE 2016-12-13 IEEE WORLD FORUM
More informationTowards Intuitive Industrial Human-Robot Collaboration
Towards Intuitive Industrial Human-Robot Collaboration System Design and Future Directions Ferdinand Fuhrmann, Wolfgang Weiß, Lucas Paletta, Bernhard Reiterer, Andreas Schlotzhauer, Mathias Brandstötter
More informationUser requirements for wearable smart textiles. Does the usage context matter (medical vs. sports)?
User requirements for wearable smart textiles. Does the usage context matter (medical vs. sports)? Julia van Heek 1, Anne Kathrin Schaar 1, Bianka Trevisan 2, Patrycja Bosowski 3, Martina Ziefle 1 1 Communication
More informationWhen in Rome: The Role of Culture & Context in Adherence to Robot Recommendations
When in Rome: The Role of Culture & Context in Adherence to Robot Recommendations Lin Wang & Pei- Luen (Patrick) Rau Benjamin Robinson & Pamela Hinds Vanessa Evers Funded by grants from the Specialized
More informationCan a social robot train itself just by observing human interactions?
Can a social robot train itself just by observing human interactions? Dylan F. Glas, Phoebe Liu, Takayuki Kanda, Member, IEEE, Hiroshi Ishiguro, Senior Member, IEEE Abstract In HRI research, game simulations
More informationAvoiding the Uncanny Valley Robot Appearance, Personality and Consistency of Behavior in an Attention-Seeking Home Scenario for a Robot Companion
Avoiding the Uncanny Valley Robot Appearance, Personality and Consistency of Behavior in an Attention-Seeking Home Scenario for a Robot Companion Michael L. Walters, Dag S. Syrdal, Kerstin Dautenhahn,
More informationModeling Human-Robot Interaction for Intelligent Mobile Robotics
Modeling Human-Robot Interaction for Intelligent Mobile Robotics Tamara E. Rogers, Jian Peng, and Saleh Zein-Sabatto College of Engineering, Technology, and Computer Science Tennessee State University
More informationCan robots become social companions?
Can robots become social companions? Humans are experts at social interaction and use this expertise in many situations. This has inspired work to develop social robots which can also utilise these abilities
More informationHuman Autonomous Vehicles Interactions: An Interdisciplinary Approach
Human Autonomous Vehicles Interactions: An Interdisciplinary Approach X. Jessie Yang xijyang@umich.edu Dawn Tilbury tilbury@umich.edu Anuj K. Pradhan Transportation Research Institute anujkp@umich.edu
More informationThe Interaction Between Voice and Appearance in the Embodiment of a Robot Tutor
The Interaction Between Voice and Appearance in the Embodiment of a Robot Tutor Helen Hastie 1, Katrin Lohan 1, Amol Deshmukh 2, Frank Broz 1 and Ruth Aylett 1 1 School of Mathematical and Computer Science,
More informationGrant agreement no: FP SPENCER: Project start: April 1, 2013 Duration: 3 years XXXXXXXXXXDELIVERABLE 4.1XXXXXXXXXX
Grant agreement no: FP7-600877 SPENCER: Social situation-aware perception and action for cognitive robots Project start: April 1, 2013 Duration: 3 years XXXXXXXXXXDELIVERABLE 4.1XXXXXXXXXX Behavior evaluation
More informationWith a New Helper Comes New Tasks
With a New Helper Comes New Tasks Mixed-Initiative Interaction for Robot-Assisted Shopping Anders Green 1 Helge Hüttenrauch 1 Cristian Bogdan 1 Kerstin Severinson Eklundh 1 1 School of Computer Science
More information