ToBI - Team of Bielefeld: A Human-Robot Interaction System for RoboCup@Home 2017


ToBI - Team of Bielefeld: A Human-Robot Interaction System for RoboCup@Home 2017

Sven Wachsmuth, Florian Lier, Sebastian Meyer zu Borgsen, Johannes Kummert, Luca Lach, and Dominik Sixt

Exzellenzcluster Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33615 Bielefeld, Germany
http://www.cit-ec.de/de/tobi

Abstract. The Team of Bielefeld (ToBI) was founded in 2009. The RoboCup team's activities are embedded in a long-term research agenda towards human-robot interaction with laypersons in regular and smart home environments. The RoboCup@Home competition is an important benchmark and milestone for this goal, in terms of both robot capabilities and the system integration effort. In order to achieve robust and stable system performance, we apply a systematic approach to reproducible robotic experimentation, including automatic tests. For RoboCup 2017, we plan to enhance this approach by simulating complete RoboCup@Home tasks. We further extend it to the RoboCup@Home standard platform Pepper. Like the Nao platform, the Pepper comes with its own runtime and development eco-system. Thus, one of the challenges will be the cross-platform transfer of capabilities between robots based on different eco-systems, e.g. the utilized middleware and application layers. In this paper, we present a generic approach to such issues: the Cognitive Interaction Toolkit. The overall framework inherently supports the idea of open research and offers direct access to reusable components and reproducible systems via a web-based catalog. A main focus of research at Bielefeld is robots as ambient hosts in a smart home or, for instance, as museum guides. Both scenarios are highly relevant for the RoboCup@Home standard platform competition. Skills developed in these domains will be transferred to the RoboCup@Home scenarios.

1 Introduction

The RoboCup@Home competition aims at bringing robotic platforms to use in realistic domestic environments.
Today's robotic systems obtain a large part of their abilities through the combination of software components from different research areas. To be able to communicate with humans and interact with the environment, robots need to coordinate and dynamically configure their components in order to generate an appropriate overall robot behavior that fulfills parallel goals such as gathering scene information, achieving a task goal,

communicating their internal status, and being always responsive to humans. This is especially relevant for complex scenarios in domestic settings.

The Team of Bielefeld (ToBI) was founded in 2009 and successfully participated in the RoboCup German Open as well as the RoboCup World Cup from 2009 to 2016. In 2016, the team finished first in several of the individual tests (Navigation, Person Recognition, GPSR, EE-GPSR, Restaurant) and, finally, won the global competition [1]. Bielefeld University has been involved in research on human-robot interaction for more than 20 years, especially gaining experience in experimental studies with integrated robotic systems [2, 3]. An important lesson learned is that the reproducibility of robotic systems and their performance is critical to show incremental progress, but that this is rarely achieved [4]. This applies to experimentation in robotics as well as to RoboCup. A Technical Description Paper (e.g. [5]) as typically submitted to RoboCup competitions is far from sufficient to describe or even reproduce a robotic system with all its artifacts. The introduction of a systematic approach towards reproducible robotic experiments [6] has turned out to be a key factor in stabilizing basic capabilities like, e.g., navigation or person following. Together with appropriate simulation engines [7], it paves the way to automated testing of complete RoboCup@Home tasks. The Cognitive Interaction Toolkit provides a framework for describing, deploying, and testing systems independent of the underlying ecosystem. Thus, the concepts apply to ROS-based components and systems as well as to those defined with, e.g., NAOqi. Combined with an appropriate abstraction architecture, reusability of components and behaviors can be achieved across platforms.
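The cross-platform reuse of components and behaviors can be illustrated with a minimal Python sketch. All class and method names below are hypothetical; BonSAI's actual interfaces differ. The idea is that a skill is written once against an abstract actuator interface, and each platform supplies its own binding (e.g. a ROS stack or NAOqi):

```python
from abc import ABC, abstractmethod

# Illustrative only: these names (NavigationActuator, goto, ...) are
# invented for this sketch and are not the real BonSAI API.
class NavigationActuator(ABC):
    """Platform-independent actuator interface a skill programs against."""
    @abstractmethod
    def goto(self, x: float, y: float) -> bool: ...

class RosNavigation(NavigationActuator):
    """Stand-in for a binding to a ROS navigation stack (Biron/Floka)."""
    def goto(self, x: float, y: float) -> bool:
        print(f"[ROS] goal ({x}, {y}) sent to the planning pipeline")
        return True

class NaoqiNavigation(NavigationActuator):
    """Stand-in for a binding to Pepper's NAOqi-based navigation."""
    def goto(self, x: float, y: float) -> bool:
        print(f"[NAOqi] navigating to ({x}, {y})")
        return True

class GotoSkill:
    """A behavior-level skill: written once, reused on every platform."""
    def __init__(self, nav: NavigationActuator) -> None:
        self.nav = nav

    def execute(self, x: float, y: float) -> bool:
        return self.nav.goto(x, y)

# The same skill code runs on either robot; only the binding changes.
for nav in (RosNavigation(), NaoqiNavigation()):
    print(GotoSkill(nav).execute(1.0, 2.0))  # True on both platforms
```

Swapping the actuator binding is then the only change needed when moving a behavior between robots.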
In the Open Challenge and the Final of 2016, we introduced a multi-robot collaboration scenario that combines small mobile sensor devices with human-sized service robots, demonstrating the scalability of the communication [8] and behavior [9] frameworks. This already showed that we are able to deal with cross-platform capabilities. Multi-robot scenarios are becoming more and more attractive for the @Home domain, because there is an increasing number of intelligent devices in regular smart homes.

The CITK framework has already been applied on the Nao platform.¹ Research using the Nao utilizes strategies for guiding the focus of attention of human visitors in a museum context [10]. For this purpose the robot needs to follow the gaze of humans as well as provide referential behaviors. Further strategies are explored in a project that combines service robots with smart environments [11], e.g. the management of the robot's attention in a multi-user dialogue [12]. For the RoboCup@Home Pepper competition we further work on appropriate simulation approaches that allow us to easily switch between the real hardware and a simulated environment, including virtual sensors and actors. In order to keep our cross-platform approach, we utilize the MORSE simulation framework [13], which additionally offers extended possibilities for modelling virtual human agents for testing human-robot interaction scenarios [14].

¹ https://toolkit.cit-ec.uni-bielefeld.de/systems/versions/nao-minimal-nightly
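Section 3.1 explains that the simulator publishes virtual sensor streams on the same ROS topics as the real sensor drivers. The pay-off of that design can be sketched with a toy publish/subscribe bus; everything below is illustrative Python only, not MORSE or ROS code:

```python
# Toy publish/subscribe bus -- illustrative only, not MORSE or ROS code.
# The point: simulator and real driver publish on the *same* topic name,
# so the consuming behavior component is identical in both setups.
class Bus:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)

def closest_obstacle(scan):
    """Behavior-side code: agnostic to where the scan came from."""
    return min(scan)

bus = Bus()
received = []
bus.subscribe("/laser/scan", lambda scan: received.append(closest_obstacle(scan)))

bus.publish("/laser/scan", [2.0, 1.5, 3.0])  # real laser driver
bus.publish("/laser/scan", [1.0, 0.8, 2.2])  # simulated laser (e.g. MORSE)
print(received)  # [1.5, 0.8]
```

Because both producers share one topic name, switching between hardware and simulation requires no change on the subscribing side.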

Fig. 1. Robotic platforms of ToBI: (a) Pepper, (b) Biron, (c) Floka, (d) AMiRo. Pepper is 120 cm tall; the overall height of Biron is 140 cm. The Floka platform has an adjustable height between 160 cm and 200 cm. The AMiRo has a diameter of 10 cm.

2 Robot Platforms

In 2016, ToBI participated in RoboCup@Home with the two service robots Biron and Floka. These were assisted by multiple instances of the smaller AMiRo as an extended mobile sensor platform. Figure 1 gives an overview of the three mentioned platforms together with Pepper as a new platform. We aim at the development of platform-independent as well as multi-platform robot capabilities. Nevertheless, each platform has different actuators and sensors.

The Social Standard Platform Pepper (cf. Fig. 1(a)) is newly introduced to the RoboCup@Home competition. It features an omnidirectional base, two ultrasonic and six laser sensors. Together with three obstacle detectors in its legs, these provide it with navigation and obstacle avoidance capabilities. Two RGB cameras, one 3D camera, and four directional microphones are placed in its head. It further possesses tactile sensors in its hands for social interaction. A tablet is mounted at the frontal body and allows the user to make choices or to visualize the internal state of the robot. In our setup we use an additional laptop as an external computing resource, which is connected to the onboard computer of the Pepper via Wi-Fi.

The robot platform Biron (cf. Fig. 1(b)) is based on the research platform GuiaBot by Adept/MobileRobots, customized and equipped with sensors that allow analysis of the current situation. The Biron platform has been continuously developed since 2001 and has been used in RoboCup@Home since 2009. Its base

has a two-wheel differential drive and two laser range finders for localization, navigation, and mapping. Two Asus Xtion PRO LIVE sensors on top of the base provide RGBD data for sensing obstacles and graspable objects. For person detection/recognition we additionally use a full-HD webcam of the type Logitech HD Pro Webcam C920. For object recognition we use a 24-megapixel DSLM camera (Sony Alpha α6000). As microphones, two Sennheiser MKE 400 are installed front- and back-facing, supported by two AKG C 400 BL on the sides. While the frontal microphones are used for speech recognition, the others are only used for speaker localization. Additionally, the robot is equipped with the Neuronics Katana 450 arm.

The following two robots will not be used at the RoboCup@Home 2017 competition. Nevertheless, they were used in previous competitions and demonstrate our platform-independent and cross-platform approach. The Floka robot has several elements (omnidirectional base, two arms, lift-controlled torso) that are also featured in the Pepper platform. For human-robot interaction, the small AMiRos required a Wi-Fi connection to an external computing resource. Similar concepts are now used for the Pepper platform.

Our robot Floka (cf. Fig. 1(c)) is based on the Meka M1 Mobile Manipulator robotic platform [1]. An omnidirectional base with Holomni's caster-wheels and a lift-controlled torso enable navigating in complex environments. In total, the robot has 37 DoF, which break down to 7 per arm, 5 per hand, 2 for the head, 2 in the torso, and 9 joints actuating the base including the z-lift. The motors in the arms, torso, and hands are Series Elastic Actuators (SEAs), which enable force sensing. The sensor head contains an RGBD and a color camera.

The AMiRo (cf. Fig. 1(d)) as used in RoboCup@Home is a two-wheeled robot with a cylindrical shape [15]. It extends and enhances the capabilities of mobile service robots.
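As a quick sanity check, the joint counts reported for Floka above do add up to the stated 37 DoF:

```python
# Tallying Floka's joint breakdown as reported in the text: 37 DoF total.
dof = {
    "arms": 2 * 7,          # 7 joints per arm
    "hands": 2 * 5,         # 5 joints per hand
    "head": 2,
    "torso": 2,
    "base_and_z_lift": 9,   # 9 joints actuate the base incl. the z-lift
}
total = sum(dof.values())
print(total)  # 37
```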
Commonly, multiple AMiRos are applied in conjunction to build a multi-robot setup which is interconnected via Wi-Fi. Each one consists of a set of stackable electronic modules for sensor processing, actuator control, and behavior coordination.

3 System Architecture

Our service robots employ distributed systems with multiple clients sharing information over the network. On these clients there are numerous software components written in different programming languages. Such heterogeneous systems require abstraction on several levels. Figure 2 gives an overview of the multiple layers of abstraction in the cooperating robot systems. Each column represents one type of robot. The behavior level (blue) represents the highest level of abstraction for all robots. This can be skills or complex behaviors. The robot-specific software (green) and hardware component interfaces (red) are unified with the BonSAI Sensor Actuator Abstraction Layer (yellow). Even skills from the small AMiRo can be seamlessly

integrated into the behavior of the service robots. Thus, software components can be easily exchanged without changing any behaviors.

Fig. 2. System architecture of ToBI's service robots. For Pepper, software components are partially deployed on an external computing resource. AMiRo acts as an external sensor/actor for the other robots. The architecture abstracts from communication protocols, which are encapsulated by the BonSAI Sensor/Actuator Abstraction Layer.

The BonSAI layer also abstracts from the middleware and component models used on the robot, which are handled on the component layer. As a consequence, a navigation skill may be defined using an appropriate ROS processing stack, while speech recognition may be defined in a different ecosystem. This approach easily extends to the processing framework of Pepper, which is integrated via a ROS-NAOqi bridge. The explicit definition of skills in BonSAI also allows reasoning about them and tracking their success during the performance of the robot. Based on this, new elements were introduced last year, like reporting on success and failure of tasks assigned to the robot in GPSR. A further focus has been on multi-robot cooperation with the AMiRo platforms.

3.1 Development, Testing, and Deployment Toolchain

The software dependencies, from operating-system dependencies to inter-component relations, are completely modeled in the description of a system distribution, which consists of a collection of so-called recipes [6].
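Since recipes model inter-component dependencies and a generator later derives CI build jobs from them, the ordering problem involved can be illustrated with Python's standard library. The component names below are invented, and the real CITK recipe format and tooling differ; this only sketches the dependency-ordering step:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical distribution: each entry stands in for a recipe and lists
# the components it depends on (names invented for illustration).
recipes = {
    "tobi-system": {"bonsai", "navigation", "speech-rec"},
    "bonsai": {"middleware"},
    "navigation": {"middleware"},
    "speech-rec": {"middleware"},
    "middleware": set(),
}

# A generator can derive a valid build order for the CI server jobs:
order = list(TopologicalSorter(recipes).static_order())
print(order[0], order[-1])  # middleware is built first, tobi-system last
```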
In order to foster reproducibility/traceability and potential software (component) re-use of the ToBI system, we provide a full specification of the 2016 system in our online catalog platform.² The catalog provides detailed information about the soft- and hardware system, including all utilized software components, as well as the facility

² https://toolkit.cit-ec.uni-bielefeld.de/systems/versions/robocup-champion-2016-2016-champion

to execute live system tests and experiments remotely.³

Fig. 3. For efficient testing it is essential that the architecture can be easily switched between a simulation and the real robot without changing any interface. The right box represents the physical Pepper platform, while the left boxes are running a virtual robot and a separate simulation environment with MORSE.

Fig. 4. Cognitive Interaction Toolkit: tool chain and workflow. The red numbers show the workflow of the system developer, while the blue numbers represent the workflow of a researcher reproducing the system.

The basic architecture for executing simulated or real platform tests on the Pepper robot is shown in Fig. 3. Software components may be deployed on an external Wi-Fi-connected computing resource or on the onboard PC of the Pepper. This is abstracted through the middleware. The MORSE simulation environment [13] allows conducting human-robot interaction experiments and provides virtual sensors for the cameras and laser-range sensors. The virtual image streams and laser scans are published on the same ROS topics as used by the real sensors. In Lier et al. [14], we show how to utilize this framework for automated testing of a virtual human agent interfering with the navigation path of a robot.

The development and deployment process by a researcher is illustrated in Fig. 4 (red numbers). It starts with the source code of her/his software components (Figure 4 (1)). These are often written in different programming languages and thus make use of diverse build environments. We address this issue by applying a generator-based solution that utilizes minimalistic template-based descriptions (recipes) of the different components that belong to a system distribution (Figure 4 (2)). Distribution files (Figure 4 (3)) are interpreted by a generator that creates build jobs on a continuous integration (CI) server. Additionally, a special build job is created that, if triggered, orchestrates the complete build and deployment process of the system. After all jobs are finished, the system is deployed (Figure 4 (4)) in the file system and is ready to use (Figure 4 (5)). Since setting up a CI server and the required plugins takes time and requires expert knowledge, we provide prepackaged installations for CITK users. Moreover, we recently introduced deployment of CITK-based systems using Linux containers like Docker. System descriptions and their metadata, e.g. source code locations, wiki pages, issue trackers, current build status, experiment descriptions, and so forth, are frequently synchronized to a web-based catalog that also implements the CITK data model, providing a globally human-readable and searchable platform, which is a prerequisite for open research.

³ In order to gain access to our remote experiment execution infrastructure please contact the authors.

4 Conclusion

We have described the main features of the architecture and technical solution of the ToBI systems for the RoboCup@Home Open Platform League (OPL) as well as the Social Standard Platform League (SSPL) 2017. BonSAI in combination with the Cognitive Interaction Toolkit (CITK) represents a flexible rapid-prototyping environment, providing capabilities of robotic systems by defining a set of essential skills for such systems. The underlying middleware allows extending it even to a distributed sensor network, here defined by two service robots and an external computing resource. We further show the implementation of the overall framework as a reproducible system distribution for different robot platforms, like the GuiaBot or Pepper.
The RoboCup@Home competitions from 2009 to 2016 served as a continuous benchmark of the newly adapted platform and software framework. In 2016, the ToBI robots gave the most stable performance throughout the competition and introduced new elements like reporting on success and failure of tasks and multi-robot cooperation. Key elements are the reusable behavior definitions across platforms and a development approach that aims at reproducible robotic experiments and testing in simulation. This line of research will be continued in 2017 for OPL as well as SSPL.

References

1. Meyer zu Borgsen, S., Korthals, T., Lier, F., Wachsmuth, S.: ToBI - Team of Bielefeld: Enhancing Robot Behaviors and the Role of Multi-Robotics in RoboCup@Home. Volume 9776. Springer (2016)
2. Wrede, B., Kleinehagenbrock, M., Fritsch, J.: Towards an integrated robotic system for interactive learning in a social context. In: Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems - IROS 2006, Beijing (2006)

3. Lohse, M., Siepmann, F., Wachsmuth, S.: A Modeling Framework for User-Driven Iterative Design of Autonomous Systems. International Journal of Social Robotics 6(1) (2014) 121-139
4. Amigoni, F., Reggiani, M., Schiaffonati, V.: An insightful comparison between experiments in mobile robotics and in science. Auton. Robots 27(4) (November 2009) 313-325
5. Meyer zu Borgsen, S., Korthals, T., Wachsmuth, S.: ToBI - Team of Bielefeld: The Human-Robot Interaction System for RoboCup@Home 2016. (2016)
6. Lier, F., Hanheide, M., Natale, L., Schulz, S., Weisz, J., Wachsmuth, S., Wrede, S.: Towards Automated System and Experiment Reproduction in Robotics. In Burgard, W., ed.: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE (2016)
7. Lier, F., Lütkebohle, I., Wachsmuth, S.: Towards Automated Execution and Evaluation of Simulated Prototype HRI Experiments. In: HRI '14 Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, ACM (2014) 230-231
8. Wienke, J., Wrede, S.: A Middleware for Collaborative Research in Experimental Robotics. In: IEEE/SICE International Symposium on System Integration (SII2011), IEEE (2011) 1183-1190
9. Siepmann, F., Ziegler, L., Kortkamp, M., Wachsmuth, S.: Deploying a modeling framework for reusable robot behavior to enable informed strategies for domestic service robots. Robotics and Autonomous Systems 63 (2012) 619-631
10. Pitsch, K., Wrede, S.: When a robot orients visitors to an exhibit. Referential practices and interactional dynamics in the real world. In: Ro-Man 2014. (2014) 36-42
11. Bernotat, J., Schiffhauer, B., Eyssel, F.A., Holthaus, P., Leichsenring, C., Richter, V., Pohling, M., Carlmeyer, B., Köster, N., Meyer zu Borgsen, S., Zorn, R., Engelmann, K.F., Lier, F., Schulz, S., Bröhl, R., Seibel, E., Hellwig, P., Cimiano, P., Kummert, F., Schlangen, D., Wagner, P., Hermann, T., Wachsmuth, S., Wrede, B., Wrede, S.: Welcome to the future - How naïve users intuitively address an intelligent robotics apartment. In: Proceedings of the 8th International Conference on Social Robotics (ICSR 2016). Volume 9979. (2016)
12. Richter, V., Carlmeyer, B., Lier, F., Meyer zu Borgsen, S., Kummert, F., Wachsmuth, S., Wrede, B.: Are you talking to me? Improving the robustness of dialogue systems in a multi-party HRI scenario by incorporating gaze direction and lip movement of attendees. In: Proceedings of the Fourth International Conference on Human-Agent Interaction, ACM Digital Library (2016)
13. Lemaignan, S., Echeverria, G., Karg, M., Mainprice, J., Kirsch, A., Alami, R.: Human-robot interaction in the MORSE simulator. In: Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ACM (2012) 181-182
14. Lier, F., Lütkebohle, I., Wachsmuth, S.: Towards Automated Execution and Evaluation of Simulated Prototype HRI Experiments. In: HRI '14 Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, ACM (2014) 230-231
15. Herbrechtsmeier, S., Korthals, T., Schöpping, T., Rückert, U.: A Modular & Customizable Open-Source Mini Robot Platform. In: 20th International Conference on Systems Theory, Control and Computing (ICSTCC), Sinaia, Romania (2016)
16. Roehlig, T.: Indoor room categorization using boosted 2D and 3D features. Master's thesis, Bielefeld University, CITEC, Bielefeld, Germany (2014) Not published.

5 Team Information

Name of Team: Team of Bielefeld (ToBI)

Contact information:
Sven Wachsmuth
Center of Excellence Cognitive Interaction Technology (CITEC)
Bielefeld University
Inspiration 1, 33619 Bielefeld, Germany
{swachsmu,semeyerz}@techfak.uni-bielefeld.de

Website: https://www.cit-ec.de/tobi

Team members: Sven Wachsmuth, Sebastian Meyer zu Borgsen, Florian Lier, Nils Neumann, Johannes Kummert, Dominik Sixt, Luca Michael Lach, Bjarte Feldmann, Felix Friese, Kai Konen, Lukas Hindemith, Robert Feldhans, Sarah Schröder, Sebastian Müller, Thilo Reinhold

Description of hardware:
- GuiaBot by Adept/MobileRobots (cf. Section 2)
- Pepper by Softbank Robotics (cf. Section 2)
- external computing resource (laptop) connected by Wi-Fi

Description of software: Most of our software and configurations are open-source and can be found at the Central Lab Facilities GitHub (https://github.com/centrallabfacilities).
- Operating System: Ubuntu 16.04 LTS
- Middleware: ROS Kinetic; RSB 0.16 [8]
- SLAM: ROS GMapping
- Navigation: ROS planning pipeline
- Object Recognition: Classification Fusion (CLAFU) [16]
- People Detection: strands_perception_people (https://github.com/strands-project/strands_perception_people)
- Behavior Control: BonSAI with SCXML
- Attention: Hierarchical Robot-Independent Gaze Arbitration (https://github.com/centrallabfacilities/simple_robot_gaze)
- Speech Synthesis: MaryTTS
- Speech Recognition: PocketSphinx with context-dependent ASR