Using proprioceptive sensors for categorizing interactions


[Extended Abstract]

T. Salter, F. Michaud and D. Létourneau
Université de Sherbrooke, Sherbrooke, Quebec, Canada
t.salter f.michaud
(Also affiliated with University of Hertfordshire)

D.C. Lee and I.P. Werry
University of Hertfordshire, Hatfield, Hertfordshire, England
d.c.lee

ABSTRACT
Increasingly, researchers are looking outside of normal communication channels (such as video and audio) to provide additional forms of communication or interaction between a human and a robot, or a robot and its environment. Amongst the new channels being investigated are infrared, proprioceptive and temperature sensors to detect touch. Our work aims at developing a system that can detect natural touch or interaction coming from children playing with a robot, and adapt to this interaction. This paper reports trials conducted using Roball, a spherical mobile robot, demonstrating how sensory data patterns can be identified in human-robot interaction and exploited to achieve behavioral adaptation. The experimental methodology used for these trials is reported; it validated the hypothesis that human interaction can not only be perceived from proprioceptive sensors on-board a robotic platform, but also that this perception can lead to adaptation.

General Terms
Algorithms, Performance, Design, Experimentation, Human Factors.

Keywords
Human-Robot Interaction (HRI), Adaptive Mobile Robots, Sensor Evaluation, Categorizing Interaction.

1. INTRODUCTION
Touch is an important form of communication or interaction between robots and humans [16], [17] or between robots and their environment [6].
While video and audio are typically used for communication and interaction between humans and robots, other sensors such as contact, infrared, proprioceptive and temperature sensors can provide additional means of communication related to touch [6], [13]. Our main aim is to develop a system where children can interact with a robot naturally and the robot can adapt to this natural interaction. How children interact with and perceive robots is itself becoming a highly studied area [3], [14], [23], [5], [22]. People working with children or in therapy are beginning to recognize that natural touch is an important form of interaction or communication with a robot [3], [21], [16], [17], [2]. In an effort to register touch or communication, some robotic systems utilize buttons that must be pushed by a person [4], [1], [19]. Salter et al. [16], [17], [15] showed that infrared sensors on-board a mobile robot, typically exploited for navigation purposes, can also be used to record interactions or natural touch coming from children playing with a mobile robot. That research demonstrates that it is even possible to detect personality traits (e.g., boisterous or cautious) of a child interacting with a wheeled robot, simply from the analysis of infrared sensor data. At MIT [20], a robotic teddy bear named Huggable is being designed with full-body sensate skin and smooth, quiet voice coil actuators, enabling it to relate to people through touch. Huggable features a series of temperature, electric field and force sensors, which it uses to sense the interactions that people have with it [12]. At NASA, Lumelsky [8] is working on developing a sensitive skin that could be used to cover a robot. The skin would include more than 1,000 infrared sensors to detect objects [7]. In related work, Saraf and Maheshwari [9] claim that their device can give a robot tactile sensitivity equivalent to that of human fingers; one early use might be in minimally invasive surgery.
The system they have developed is based on alternating layers of gold and semiconducting cadmium sulfide nanoparticles separated by nonconducting (dielectric) films. They hope to coat a robot hand with this film. Another move away from the typical use of microswitches for detecting touch is realized in the robot seal named Paro, which was developed for

robot-assisted activity in hospitals or homes for the elderly. In this system, physical contact with the robot, for example touching, is recognized by a system based on balloons [21].

Figure 1: Roball, an autonomous rolling robot.
Figure 2: Front/back and side views.

Our interests lie in studying how a robot's proprioceptive sensors (which can be used for navigation, control or other purposes) can be exploited as a form of communication and as a way to capture natural touch or interaction between a mobile robot and children. First, a series of trials was conducted using Roball (see Figure 1), in laboratory conditions without children and then in real-life settings with children. Analysis of the accelerometer and tilt sensor data established that it is possible to detect play patterns from these sensors. Second, another series of trials was designed to follow on from and expand upon this work, pursuing the goal of adapting the robot's behavioral response to the stimulation or interaction it receives from children playing with it. These trials tested whether heuristics identified in the previous trials could be utilized to enable Roball to adapt to human interaction. Again, trials were conducted both in the laboratory without children and in a real-life setting with children. Our findings are presented in this paper.

2. ROBALL
Shown in Figure 1, Roball is 6 inches in diameter and weighs about 4 pounds [10], [11]. It consists of a plastic sphere (a hamster exercise ball) constructed from two halves that are attached to each other. The plastic sphere houses the fragile electronics (sensors, actuators, processing elements), making the robot robust and ideal for interaction with children. The fact that Roball is spherical encourages a wide range of play situations.
Movement is achieved through a combination of two propulsion motors that propel the shell of Roball, and a counterweight that controls direction by moving the center of gravity towards the required direction (see Figure 2). The propulsion motors are attached to the shell wall; their rotation causes Roball to move forward or backward. The steering motor moves the counterweight from one side to the other in order to move the center of gravity away from the center of the sphere.

Figure 3: Roball's accelerometers, centered above the steering motor.

Roball's first prototype used binary mercury switches to detect and control lateral and longitudinal inclinations of the internal plateau. To provide a wider set of perceptual states, new sensors were installed on the platform. The version of Roball used in our work, running a PIC18F458 microcontroller with 32 kbytes of internal memory and 1.5 kbytes of RAM, operating at 40 MHz (10 MIPS), has the following special features:

Accelerometers - Roball has three accelerometers, one for each axis (X, Y and Z). Analog ADXL311 miniature accelerometers measure Roball's acceleration along each of the three axes over a 2g range (see Figure 3).

Tilt sensors - There are three tilt sensors: one for left tilt, one for right tilt and one for forward/backward tilt. Sharp GP1S036HEZ miniature photointerrupters detect the tilt direction by sensing the movement of a small ball. The tilt sensors are positioned on Roball's printed circuit board. Two tilt sensors are placed symmetrically on the left/right axis (the axis corresponding to the line between Roball's two propulsion motors). This configuration allows the detection of left or right tilt, with both sensors giving the same value, and also allows the detection of rotation, with the sensors giving opposite left/right tilt values due to centrifugal acceleration. For example, if the ball is tilted to the left, both the right and the left tilt sensors give a reading of (L). If the ball is tilted to the right, both sensors give a reading of (R). Finally, if the ball is spinning, the right and left tilt sensors give opposite readings: the right sensor gives a reading of (R) and the left sensor gives a reading of (L) (see Figure 4).

Figure 4: Roball's tilt sensors: (a) placement; (b) possible values with L = Left, R = Right, B = Back and F = Front; (c) left tilt being registered.
Figure 5: Examples of the experiments in the laboratory. The robot executed a wandering and a simple obstacle avoidance behaviour.

3. EXPERIMENTAL SETTINGS
In both sets of trials (first and second), the robot was programmed to execute two simple behaviors: wandering and obstacle avoidance. These behaviors were carried out for the duration of each trial; Roball's function was to act as a mobile, moving toy. Four small wooden walls enclose the experimental arena, creating a pen as shown in Figures 5 and 6. The pen is approximately 2.5 m by 2 m. Every experiment was videotaped for verification of sensor readings. Sensor readings were recorded on-board the robot every 0.10 second. After the preprogrammed duration of the trial, the robot stops by itself, ending the trial. All of the laboratory experiments took place at the Université de Sherbrooke, without children present. In the real-life settings, the children were typically developing boys aged between 5 and 7 years. All the children were treated the same. None of the children had been exposed to robotic devices other than those found in toy stores.
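For illustration, the tilt sensor interpretation described in Section 2 (agreeing left/right readings indicate a lateral tilt, opposing readings indicate rotation) can be sketched in a few lines. This is a hypothetical sketch only; the function and label names are ours, not Roball's PIC firmware:

```python
def decode_tilt(left_sensor, right_sensor):
    """Interpret the two symmetric left/right tilt sensors.

    Each sensor reports 'L' or 'R'. Agreeing readings indicate a lateral
    tilt; opposite readings indicate spinning, since centrifugal
    acceleration pushes the sensing balls outward in opposite directions.
    """
    if left_sensor == right_sensor:
        return "TILT_LEFT" if left_sensor == "L" else "TILT_RIGHT"
    return "SPINNING"
```

A decoder of this form is what the adaptation algorithm later relies on to separate spinning from mere tilting.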
When the trial begins, they are asked to step inside the pen and to play with Roball.

Figure 6: The children playing with Roball in a home setting. Roball is programmed with adaptive behaviours.

The experimenter attempts to have as little effect on the children as possible: giving the same set of instructions to all the children, not engaging in conversation with them, and not prompting them to play with the robot once the trial has commenced.

4. FIRST TRIALS: OBTAINING HEURISTICS
The first set of trials involved a series of laboratory experiments, then a series of trials held in real-life settings, namely a play group and a school.

4.1 Laboratory Experiments
These experiments investigated whether measurements from two different types of proprioceptive sensors could record events such as jolts to the robot, the robot receiving general interaction, the robot being carried or the robot being spun. The experiments were broken down into seven environmental conditions:

1. Alone (i) - Roball wandering in the laboratory.
2. Alone (ii) - Roball wandering in the experimental pen.
3. Light Boxes - Roball wandering in the pen with boxes present.
4. Heavy Boxes - Roball wandering in the pen with weighted boxes present.
5. Carrying - Experimenter walking whilst carrying Roball.

6. Interaction - Experimenter simulating interaction, e.g. kicking, pushing, banging Roball.
7. Spinning - Experimenter spinning Roball.

Figure 7: The three different axes' readings for each of the environmental conditions.
Figure 8: Averaged differences between the X and Z accelerometer readings.

Data analysis was performed after the experiments had been conducted, using simple calculations on the data collected from the accelerometers and tilt sensors during these seven environmental conditions. The data comes from an average of three experiments of 5 minutes each, repeated for each of the seven environmental conditions. Data was recorded at 10 Hz.

Accelerometer Results
Figure 7 shows that during the seven different environmental conditions, each of the accelerometers produces a different (average) signature. Although there are similarities between the signatures, each is unique in some way. Figure 8 illustrates that the difference between the X accelerometer and the Z accelerometer (X - Z) also produced very interesting results that could be used to discriminate the robot's interaction status. For instance, only in condition (5) Carrying do we see a negative X - Z difference; also, condition (1) Alone (i) shows the largest difference between these two axes. Line graphs of the trials clearly show a difference between the way human contact with the robot is registered by the accelerometers and what these sensors register when the robot is alone (see Figures 9 and 10). These graphs show a marked difference in sensor readings and indicate that the readings can be used to correctly classify human interaction, and thus to adapt to it. When the robot has experienced interaction, all three axes show jagged readings that constantly cross each other.
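The X - Z discrimination described above amounts to simple per-axis averaging over a trial's log. A minimal sketch under our own naming (the sample values and function names are illustrative, not the trial data):

```python
def axis_averages(samples):
    """Average each accelerometer axis over a trial's samples.

    `samples` is a sequence of (x, y, z) readings logged at 10 Hz.
    Returns (mean_x, mean_y, mean_z).
    """
    n = len(samples)
    mean_x = sum(s[0] for s in samples) / n
    mean_y = sum(s[1] for s in samples) / n
    mean_z = sum(s[2] for s in samples) / n
    return mean_x, mean_y, mean_z

def xz_difference(samples):
    """The averaged X - Z difference used to discriminate conditions:
    a negative value suggests Carrying, and a large positive value
    suggests the robot is Alone."""
    mean_x, _, mean_z = axis_averages(samples)
    return mean_x - mean_z
```

Computed per condition, these averages reproduce the per-condition signatures of Figures 7 and 8.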
When the robot is alone, we see large gaps between the axes.

Figure 9: The erratic X, Y and Z axis sensor data from the accelerometers when Roball is in experimental condition (6) Interaction.
Figure 10: Accelerometer data when Roball is in experimental condition (1) Alone (i).

Tilt Sensor Results
Data was recorded from the tilt sensors during each of the seven environmental conditions. When the robot is spinning, the two tilt sensors should give different values, as described in Section 2 (see Figure

4). From the results shown in Figure 11, we can see that condition (7) Spinning, as expected, produces the highest results.

Figure 11: Tilt sensor readings for the seven environmental conditions.

4.2 Real Life Settings
Two real-life settings, a play group and a school, were used to confirm that the patterns found under laboratory conditions would also be found in the real world. In total, eight boys participated in the trial, each playing with Roball at least twice. Trials were initially conducted for 5 minutes, but this seemed too long for the children's attention span, so the length of the trials was shortened to 4 minutes. The analysis of accelerometers, tilt sensors and line graphs from the real-life settings gave results similar to those found in the lab. As in the laboratory trials, we found that cautious children who did not play with Roball very much produced the largest difference between the X and Z axes (see Figure 8 for the laboratory results). The highest number of differing tilt sensor readings was obtained for active children who spun Roball (see Figure 11 for the laboratory results). When a child played with Roball, we observe that interaction can be seen as jagged lines; when the child did not play with Roball, we see gaps between the different axes (see Figure 12).

4.3 First Trials: Conclusions
This first set of trials showed that it is possible to detect different environmental conditions through the analysis of proprioceptive sensors (accelerometers and tilt sensors) [18]. Overall, these analyses indicate that different environmental conditions, which can be associated with forms of interaction, can be detected through the analysis of proprioceptive sensor data. Detecting that the robot is being carried is the easiest, followed by detecting that the robot is being spun.
Detecting general interaction with a person is not so easy, but still possible; it is easier, however, to detect when the robot is alone, i.e., not receiving any type of interaction from a person. To categorize between these states, the sensor reading space can be zoned into regions. The objective is to achieve this categorization using an on-board feature detection algorithm, in real time, and to adapt the robot's behavior to the interaction experienced.

Figure 12: Accelerometer readings when a child was interacting with Roball in a real-life setting.

5. HEURISTICS FOR BEHAVIORAL ADAPTATION
The first trials showed that it is possible to detect different environmental conditions. More specifically, related to interaction with people, accelerometer and tilt readings can be classified into zones which detect four modes of interaction: ALONE, GENERAL INTERACTION, CARRYING and SPINNING. Another condition, named NO CONDITION, is necessary for situations not covered by the other four. By detecting these states, the robot's behavior can be changed or adapted to respond in a particular fashion to interaction with children. The algorithm developed uses the following five rules, which are specific instantiations of the heuristics derived from the analysis presented in Section 4 and in [18]:

A - Being Alone. If the average difference between the X and Z accelerometer readings is above 0.05, set the current mode to ALONE.
B - Receiving General Interaction. If the average difference between the X and Z accelerometer readings is below 0.03 and above zero, set the current condition to GENERAL INTERACTION.
C - Being Carried. If the average difference between the X and Z accelerometer readings is negative, set the current condition to CARRYING.
D - Being Spun. If the tilt sensors show different readings (see Figure 11), set the condition to SPINNING. Another way to detect spinning is if the average reading for the Z axis is positive, coupled with an average Y axis reading above 0.05 (see Figure 7).
E - No Condition. If the sensor readings do not fall into any of the above categories, set the condition to NO CONDITION.
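The five rules above can be sketched as a small windowed classifier. The sketch below is illustrative, not Roball's on-board implementation: the names, the check order (spinning first, then the accelerometer zones) and the choice to flag spinning if any sample in the window shows disagreeing tilt readings are our assumptions.

```python
from collections import deque

# The window length matches a 4 s averaging window at 10 Hz sampling.
WINDOW = 40  # samples

def classify(xz_avg, y_avg, z_avg, tilt_differs):
    """Map windowed averages onto the five interaction conditions (rules A-E)."""
    if tilt_differs:                    # rule D: opposing tilt readings
        return "SPINNING"
    if xz_avg < 0:                      # rule C: being carried
        return "CARRYING"
    if z_avg > 0 and y_avg > 0.05:      # rule D: alternate spin test
        return "SPINNING"
    if xz_avg > 0.05:                   # rule A: being alone
        return "ALONE"
    if 0 < xz_avg < 0.03:               # rule B: general interaction
        return "GENERAL INTERACTION"
    return "NO CONDITION"               # rule E

class InteractionMonitor:
    """Slide a 4 s window over 10 Hz samples and classify each step."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def update(self, x, y, z, left_tilt, right_tilt):
        self.samples.append((x, y, z, left_tilt != right_tilt))
        if len(self.samples) < WINDOW:
            return None                 # still filling the initial window
        n = len(self.samples)
        x_avg = sum(s[0] for s in self.samples) / n
        y_avg = sum(s[1] for s in self.samples) / n
        z_avg = sum(s[2] for s in self.samples) / n
        tilt_differs = any(s[3] for s in self.samples)
        return classify(x_avg - z_avg, y_avg, z_avg, tilt_differs)
```

Each call to `update` corresponds to one 0.10 s sensor reading; once the window is full, every subsequent reading yields a classification.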

Different from the first trials, the algorithm uses a temporal window of 4 seconds to calculate an average of the sensor readings, and from this derives which condition it believes the robot is currently experiencing. This window is moved forward in time in 0.10 s increments. After examining the heuristics, it was decided to develop an algorithm that checks the conditions in order of how easily they are believed to be detectable, i.e., checking the conditions that are easiest to classify first. The final flow of the algorithm was determined by various tests conducted in the laboratory, which checked the ability of the algorithm to detect the differing conditions, until the optimal flow was achieved. The algorithm first attempts to detect the condition SPINNING by looking at the difference in the tilt sensor readings. If this condition is found to be true, no analysis of the accelerometers is carried out. If it is found to be false, the accelerometers are then analysed. For this analysis, the condition CARRYING is first checked by looking for a negative (X - Z) average. Next, the condition GENERAL INTERACTION is looked for by checking whether the Z-axis average is below zero. Then, the condition SPINNING is analysed again by checking for a positive Z-axis average coupled with an average Y-axis reading above 0.05. The condition ALONE is then identified if the X - Z average is above 0.05. Next, the condition GENERAL INTERACTION is classified if the X - Z average is below 0.03 but above zero. If, after all this, the current average within the four-second window does not fall into any of the categories, NO CONDITION is set as the output. The resulting pseudo-code of the algorithm is:

IF tilt sensors are different
    output SPINNING
ELSE  % Check accelerometers
    IF x-z average < 0
        output CARRYING
    ELSE IF z average < 0
        output GENERAL INTERACTION
    ELSE IF z average > 0 && y average > 0.05
        output SPINNING
    ELSE IF x-z average > 0.05
        output ALONE
    ELSE IF x-z average > 0 && x-z average < 0.03
        output GENERAL INTERACTION
    ELSE
        output NO CONDITION
    END IF
END IF

6. SECOND TRIALS: BEHAVIORAL ADAPTATION
A second series of trials was conducted, mimicking those carried out in the first set. Again, trials were conducted in the laboratory and in a real-life setting.

6.1 Laboratory Experiments
The experiments were broken down into the four modes of interaction listed below:

Being Alone - Roball wandered in the pen by itself, no objects or humans present.
Being Carried - The experimenter walked whilst carrying Roball for the duration of the experiment.
Receiving General Interaction - The experimenter simulated interaction from a child, for example pushing, banging and getting in the way of the robot.
Being Spun - The experimenter purposely spun the robot for the duration of the experiment.

Three separate trials were conducted for each of the four conditions, and each individual trial again lasted 4 minutes. Thus, in total, 12 experiments were carried out, lasting a total of 48 minutes. This resulted in 2360 interaction classifications per trial (10 per second for 240 seconds, less the 40 from the four seconds during which the algorithm initializes its temporal averaging window). The adaptation algorithm was implemented on-board Roball, and the identified states were recorded as interaction occurred. Table 1 presents the observed results of the identified states (A = ALONE, B = GENERAL INTERACTION, C = CARRYING, D = SPINNING, E = NO CONDITION) in relation to the four modes of interaction.

Algorithm Results
The results represent the percentage of time that each state was identified during the trial. The objective is to maximize valid identification and minimize false detection. As can be seen from the leading diagonal of Table 1, the robot can identify the following with reasonable accuracy:

Being Alone (97%)
Being Carried (92%)
Being Spun (77%)

However, identifying Receiving General Interaction (10%) proved more difficult. A probable cause is that at times the robot was in fact SPINNING or ALONE during the General Interaction trials. Such conditions would therefore be identified under the corresponding categories: (D) SPINNING 45% and (A) ALONE 19% of the time. Therefore, adding the results for conditions (A) ALONE, (B) GENERAL INTERACTION and (D) SPINNING, a total of 74% of the classifications were correctly identified by the algorithm during the GENERAL INTERACTION experiment. It should be noted that the experimenter's simulation of general interaction was fairly vigorous. For example, the experimenter pushed and kicked the robot with quite some force, which did cause the robot to spin. This is not always the case with children, as can be seen from the line graph of a child's interaction in Figure 12. In general, from observations of the first set of trials, children seem to interact with the robot in a more punctuated manner, so that sustained general interaction is rather rare. For example, they may push the

robot then wait, or spin the robot but not then straight away push and kick it. Therefore, it is believed that recording the varying interaction may be easier in a real-life setting than in the laboratory. The misclassification of the condition CARRYING whilst receiving general interaction, as shown in Table 1, requires further discussion. It was discovered that when the robot hits the wall of the pen, it actually records the same readings as being carried: as the robot hits the pen wall, it rolls slightly up the wall, and this causes accelerometer readings similar to those produced when the robot is picked up. This helps to explain why, when the robot was Receiving General Interaction, the condition CARRYING was registered 19% of the time, and similarly, when the robot was Being Spun, the condition CARRYING was registered 17% of the time. During both trials the robot did roll up the wall of the pen, whereas when the robot was Alone it did not, and thus the condition CARRYING was recorded only 0.5% of the time. The problem of the robot rolling up the wall was not considered significant, as the pen was only being used for this specific research work. In other real-life settings the robot would not be so confined, and this phenomenon would therefore not happen so often.

6.2 Adding Adaptation and Testing in a Real Life Setting
Three adaptive behaviors were added to the robot: two involved vocals and one involved motion coupled with vocals.

1. When the robot classifies its sensor readings as SPINNING, it produces the sound "weeeeeeeeeeeeee".
2. When the robot classifies its sensor readings as CARRYING, it stops all motion and says "put me down".
3. When the robot classifies its sensor readings as ALONE, it says "play with me".

Both the sounds and the movement response are repeated until the condition that initiated the behavior changes.
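The three adaptive behaviors above form a simple condition-to-response lookup that repeats while the detected condition persists. A minimal sketch (the table keys mirror the conditions above, but the names and action tuples are our illustrative assumptions, not Roball's control code):

```python
# Map each detected condition to an adaptive response.
# Conditions with no entry (GENERAL INTERACTION, NO CONDITION)
# leave the robot's current behavior unchanged.
RESPONSES = {
    "SPINNING": ("say", "weeeeeeeeeeeeee"),
    "CARRYING": ("stop_and_say", "put me down"),
    "ALONE":    ("say", "play with me"),
}

def react(condition):
    """Return the (action, utterance) to perform for this condition, or None.

    Called once per classification step; because the mapping is stateless,
    the response naturally repeats until the condition changes.
    """
    return RESPONSES.get(condition)
```

Calling `react` on every output of the classifier reproduces the repeat-until-condition-changes behavior described above.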
Audible responses are easily perceived by an external observer, which facilitates post-experiment analysis compared to a change in the behavioral control of the robot. In this work it was decided to begin simply, with limited adaptation, for a clear and precise evaluation of the algorithm's performance in the child-robot trials. The experiments in a real-life setting took place at a house, as shown in Figure 6. All the children were known to the experimenter in a social context. Different from the first trials, the children were told two things about the robot: 1) the robot could be spun; 2) they could pick the robot up, but they must not then drop it and should put it gently down on the floor. Each trial with the children lasted for four minutes. From the preliminary results observed with children interacting with the robot in this trial, we clearly observe that the robot did react to the children, and that when the robot reacted there was an increased level of involvement by the children. However, it was noticed that at times the robot did not react correctly. One clear case is the identification of Carrying when the robot hit the wall of the pen, causing it to stop all motion and say "put me down". Interestingly, as a side effect, this caused an even higher level of engagement and interaction from the children: e.g., a child looking at the experimenter and saying "it is asking me to put it down", and then proceeding to aid the robot by moving it so that it could progress on its way. Overall, the response of the children was very encouraging. It shows, in principle, that it is possible to adapt the behavior of a robot using sensor readings. There was an evident increase in the level of the children's engagement and interaction in the second trial compared to the first, which seems to be due to the more complex behavior. Increasing levels of engagement and interaction with children was the ultimate goal of the robot.
Further, detailed analysis of the child-robot study needs to be done. We also need to conduct further trials with a larger sample of children, and to fully investigate the reasons for any incorrect categorization by the algorithm, which ultimately leads to an incorrect reaction of the robot.

7. CONCLUSIONS
This work demonstrates that proprioceptive sensors are capable of detecting and recording human interaction. Through the analysis of accelerometers and tilt sensors (proprioceptive sensors) on the Roball platform, it was found that it is possible to detect the robot being carried, being spun, being alone, and also interacting with a person. The analysis required is simple, works in real time on a small embedded microcontroller, and does not require complex processes. Carrying out preliminary trials in the laboratory without children helped to provide baseline and benchmark readings for the subsequent child-robot trials. Further work will include more detailed analysis of the adaptation algorithm used with Roball interacting with children. Also planned is the use of the adaptation algorithm to change the robot's navigational behavior, and a study of the effects on engagement and interaction with children.

8. ACKNOWLEDGMENTS
F. Michaud holds the Canada Research Chair (CRC) in Mobile Robotics and Autonomous Intelligent Systems. This work is funded by the CRC and the Canadian Foundation for Innovation.

9. REFERENCES
[1] A. Billard. Robota: Clever toy and educational tool. Robotics and Autonomous Systems, 42: ,
[2] A. Duquette, H. Mercier, and F. Michaud. Investigating the use of a mobile robotic toy as an imitation agent for children with autism. In International Conference on Epigenetic Robotics, Paris, France,
[3] T. Ito and N. P. R. Center. How children perceive robots. univ/05/univ e05.html, last accessed 06/10/04, 2003.

Table 1: The rows of the table correspond to the four tested environmental conditions; the columns correspond to the five categories derived from the sensor readings (A = ALONE, B = INTERACTION, C = CARRYING, D = SPINNING, E = NO CONDITION). The percentage of correctly classified interaction is shown on the leading diagonal; for example, the robot being alone is correctly classified 97% of the time.

                                   A      B      C      D      E
Being Alone                       97%   0.5%   0.5%    1%     1%
Receiving General Interaction     19%    10%    19%    45%    7%
Being Carried                    0.5%    3%    92%     3%   1.5%
Being Spun                         2%    3%    17%    77%    1%

[4] B. Jensen, G. Froidevaux, X. Greppin, A. Lorotte, L. Mayor, M. Meisser, G. Ramel, and R. Siegwart. The interactive autonomous mobile system RoboX. In IROS 2002, IEEE/RSJ International Conference on Intelligent Robots and Systems, pages . IEEE Press,
[5] T. Kanda, T. Hirano, D. Eaton, and H. Ishiguro. Interactive robots as social partners and peer tutors for children: A field trial. Human-Computer Interaction, 19:61-84,
[6] O. Kerpa, K. Weiss, and H. Worn. Development of a flexible tactile sensor system for a humanoid robot. In IEEE International Conference on Intelligent Robots and Systems (IROS 2003). IEEE Press,
[7] V. Lumelsky. NASA - vladskin.html. Last accessed
[8] V. Lumelsky, M. Shur, and S. Wagner. Sensitive skin. IEEE Sensors Journal, 1,
[9] V. Maheshwari and R. Saraf. High-resolution thin-film device to sense texture by touch. Science, pages , June
[10] F. Michaud and S. Caron. Roball, the rolling robot. Autonomous Robots, 12(2): ,
[11] F. Michaud, J.-F. Laplante, H. Larouche, A. Duquette, S. Caron, D. Létourneau, and P. Masson. Autonomous spherical mobile robot for child-development studies. Systems, Man, and Cybernetics, 35: ,
[12] MIT. ResearchPubWeb.pl?ID=53. Last accessed 24/08/06.
[13] T. Miyashita, T. Tajika, H. Ishiguro, K. Kogure, and N. Hagita. Haptic communication between humans and robots. In 12th International Symposium of Robotics Research, San Francisco, CA, USA,
[14] B.
Robins, K. Dautenhahn, R. te Boekhorst, and A. Billard. Effects of repeated exposure to a humanoid robot on children with autism. Universal Access and Assistive Technology (CWUAAT), pages ,
[15] T. Salter and K. Dautenhahn. Guidelines for robot-human environments in therapy. In IEEE RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication, pages 41-46, Kurashiki, Okayama, Japan. IEEE Press.
[16] T. Salter, K. Dautenhahn, and R. te Boekhorst. Learning about natural human-robot interaction. Robotics and Autonomous Systems, 54(2): ,
[17] T. Salter, K. Dautenhahn, and R. te Boekhorst. Robots moving out of the laboratory - detecting interaction levels and human contact in noisy school environments. In IEEE RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication, pages , Kurashiki, Okayama, Japan. IEEE Press.
[18] T. Salter, F. Michaud, K. Dautenhahn, D. Létourneau, and S. Caron. Recognizing interaction from a robot's perspective. In RO-MAN 05, 14th IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA,
[19] Sony. Last accessed 06/10/04.
[20] W. D. Stiehl, J. Lieberman, C. Breazeal, L. Basel, L. Lalla, and M. Wolf. The design of the Huggable: A therapeutic robotic companion for relational, affective touch. In AAAI Fall Symposium on Caring Machines: AI in Eldercare, Washington, D.C.,
[21] D. Wada, T. Shibata, T. Saito, and K. Tanie. Robot assisted activity for elderly people and nurses at a day service center. In IEEE International Conference on Robotics and Automation, pages , Washington, DC,
[22] T. Watanabe, R. Danbara, and M. Okubo. InterActor: Speech-driven embodied interactive actor. In IEEE RO-MAN 2002, 11th International Workshop on Robot and Human Interactive Communication, pages , Berlin, Germany. IEEE Press.
[23] S. Woods, K. Dautenhahn, and J. Schulz. Child and adults' perspectives on robot appearance.
In AISB 05 Symposium on Robot Companions: Hard Problems and Open Challenges in Robot-Human Interaction, Hertfordshire, England, UK, 2005.
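The per-condition percentages in Table 1 form a confusion matrix, with the correct classification rate for each condition on the leading diagonal. As an illustrative sketch (the values are taken from Table 1, but the unweighted averaging at the end is an assumption for illustration only, not a statistic reported in the paper):

```python
# Confusion matrix from Table 1: rows are the true conditions,
# columns are the predicted categories A..E (values in percent).
conditions = ["Being Alone", "Receiving General Interaction",
              "Being Carried", "Being Spun"]
categories = ["ALONE", "INTERACTION", "CARRYING", "SPINNING", "NO CONDITION"]
matrix = [
    [97.0,  0.5,  0.5,  1.0, 1.0],   # Being Alone
    [19.0, 10.0, 19.0, 45.0, 7.0],   # Receiving General Interaction
    [ 0.5,  3.0, 92.0,  3.0, 1.5],   # Being Carried
    [ 2.0,  3.0, 17.0, 77.0, 1.0],   # Being Spun
]

# Condition i should be recognized as category i, so the correct
# classification rate for each condition sits on the leading diagonal.
for i, condition in enumerate(conditions):
    print(f"{condition}: {matrix[i][i]}% correctly classified")

# Unweighted mean of the diagonal, assuming (illustratively) that each
# condition occurred equally often during the trials.
mean_accuracy = sum(matrix[i][i] for i in range(len(conditions))) / len(conditions)
print(f"Mean per-condition accuracy: {mean_accuracy}%")
```

Reading the matrix this way makes the paper's point visible at a glance: being alone, being carried, and being spun are recognized reliably, while general interaction is most often confused with spinning (45% in column D).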

More information

UNIT VI. Current approaches to programming are classified as into two major categories:

UNIT VI. Current approaches to programming are classified as into two major categories: Unit VI 1 UNIT VI ROBOT PROGRAMMING A robot program may be defined as a path in space to be followed by the manipulator, combined with the peripheral actions that support the work cycle. Peripheral actions

More information

Comparing Human Robot Interaction Scenarios Using Live and Video Based Methods: Towards a Novel Methodological Approach

Comparing Human Robot Interaction Scenarios Using Live and Video Based Methods: Towards a Novel Methodological Approach Comparing Human Robot Interaction Scenarios Using Live and Video Based Methods: Towards a Novel Methodological Approach Sarah Woods, Michael Walters, Kheng Lee Koay, Kerstin Dautenhahn Adaptive Systems

More information

Evaluation of Five-finger Haptic Communication with Network Delay

Evaluation of Five-finger Haptic Communication with Network Delay Tactile Communication Haptic Communication Network Delay Evaluation of Five-finger Haptic Communication with Network Delay To realize tactile communication, we clarify some issues regarding how delay affects

More information

Humanoid robots: will they ever be able to become like us, and if so, do we want this to happen?

Humanoid robots: will they ever be able to become like us, and if so, do we want this to happen? Humanoid robots: will they ever be able to become like us, and if so, do we want this to happen? Benjamin Schnieders April 17, 2011 Abstract This essay will shortly discuss the question whether there will

More information

CB 2 : A Child Robot with Biomimetic Body for Cognitive Developmental Robotics

CB 2 : A Child Robot with Biomimetic Body for Cognitive Developmental Robotics CB 2 : A Child Robot with Biomimetic Body for Cognitive Developmental Robotics Takashi Minato #1, Yuichiro Yoshikawa #2, Tomoyuki da 3, Shuhei Ikemoto 4, Hiroshi Ishiguro # 5, and Minoru Asada # 6 # Asada

More information

Multi-Modality Fidelity in a Fixed-Base- Fully Interactive Driving Simulator

Multi-Modality Fidelity in a Fixed-Base- Fully Interactive Driving Simulator Multi-Modality Fidelity in a Fixed-Base- Fully Interactive Driving Simulator Daniel M. Dulaski 1 and David A. Noyce 2 1. University of Massachusetts Amherst 219 Marston Hall Amherst, Massachusetts 01003

More information

Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots

Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots Eric Matson Scott DeLoach Multi-agent and Cooperative Robotics Laboratory Department of Computing and Information

More information

A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration

A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration A Pilot Study: Introduction of Time-domain Segment to Intensity-based Perception Model of High-frequency Vibration Nan Cao, Hikaru Nagano, Masashi Konyo, Shogo Okamoto 2 and Satoshi Tadokoro Graduate School

More information