QUTIE - TOWARD A MULTI-FUNCTIONAL ROBOTIC PLATFORM


Matti Tikanmäki, Antti Tikanmäki, Juha Röning
University of Oulu, Computer Engineering Laboratory, Intelligent Systems Group

ABSTRACT

In this paper we present an improved concept for building a complex robotic system with human-robot interaction capabilities. Qutie is a multi-functional robot designed and built in the Robotics Group at the University of Oulu. Qutie's applications include guiding in public places, demonstrating a laboratory and surveillance. Operating with humans requires more natural ways of interacting. In addition to spoken dialogs, consisting of speech recognition and synthesis, we have incorporated different ways of passing information to the robot, activating operation and describing tasks. These include, for example, tactile sensors on the robot's body and machine vision for detecting human faces and gestures. The operation of the robot was demonstrated successfully at the IST2006 conference, and Qutie is operating as a guide in our laboratory.

1. INTRODUCTION

During the last couple of years, the number of interactive robots in both the toy industry and in service applications has increased significantly. From the point of view of the man in the street, robots are no longer only laborers in a factory. Robots can be found cleaning homes and public places (e.g. Electrolux's Trilobite [9]), cutting grass (e.g. Electrolux's Automower) and amusing people with humanlike gestures. One can say that these robots have even somehow come into vogue. Different kinds of toy robots can already be found in most stores that sell toys or domestic appliances. There are also more complex robots available for ordinary people: a Japanese company is renting out a guide robot, Ubiko [10], for events. Ubiko is controlled by voice and responds quickly to user commands as well. Its main tasks are welcoming guests and promoting products. The growing robot market requires more and more new features from robots.
People's expectations of the capabilities of robots in stores are not high. To get people interested in buying robots and using them in everyday life, robots will have to gain more functionality. As a starting point for interactive robot studies, several aspects can be taken into account. For example, is the robot a tool for something, or should it act as a companion for people? People often also expect a robot to be smart or even somehow clever. Most expectations are based on science fiction, but above all it can be seen that people want their robots to be individuals; that is, robots should have some kind of personality and habits of their own.

In seeking a solution to the complex problem of workable Human-Robot Interaction (HRI), it is easiest to start from the robot's side. An example is Kismet [4], which is designed to make friends and influence people by physical means. Breazeal and Scassellati write that "the robot must make the human believe that it has beliefs, desires, and intentions". The idea that the robot must convince the human of having deeper intentions makes HRI design easier: the researcher can trust that if the robot's appearance and gestures are made believable enough, human nature will do the rest. This is based on the human capability of adapting to new things.

Qutie is a multi-functional robot designed and built in the Robotics Group at the University of Oulu. Human expectations have been a starting point in developing Qutie (Figure 1) to be more human-friendly. Communication between the user and the robot is based on more natural ways of passing information, e.g. touch, recognition of gestures and finding a human face. Such primitive ways of communicating are suited to all users, whether they are children or adults.

Figure 1. The Qutie robot.

This article introduces the main features of the latest version of Qutie. Chapter 2 introduces the robot's hardware, design and electronics. The modularity of the system is also described within the framework of the development of the robot. The theory of the software architecture is explained in Chapter 3, which also includes information about the tools used in the project (the Qt library, machine vision methods and speech synthesis). Chapter 4 introduces the robot in action, including the first experiences with performance and some future development plans. Finally, some conclusions on the design work are briefly discussed.

2. QUTIE, A MULTI-FUNCTIONAL ROBOTIC PLATFORM

Building a complex robotic system is often a time- and resource-consuming process. Designing the mechanics, electronics and appearance takes time, and all this delays the making of the content, which is the software. Using modular methods to implement the electronics and software of a robot shifts the balance of the whole robot project toward content-making rather than building the base. This kind of modular robot construction might be the key to taking robots to the next, more intelligent, level. In this paper we present a concept for mobilizing a modular and multi-functional robot in the real world.

2.1. Concept

The framework for the interactive and friendly Qutie consists of three basic points: regulating units, the appearance of the robot and the software architecture used in the robot. With the help of regulating units and the friendly appearance, we have built a robot that gets people interested in interacting with the system. The modularity of the system helps us to expand and modify the system for development or even for completely new tasks (e.g. surveillance). Modularity is implemented on both the software (Property Service Architecture, PSA) and the electronics (Atomi modules) level, and it is described in Chapter 2.4.
The concept of the multifunctional, modular and interactive robot system is shown in Figure 2. The human-robot interface is implemented using Skype, machine vision and a touch-sensitive EMFi layer, which are all also shown in Figure 2.

Figure 2. Concept of Qutie.

2.2. Hardware

Qutie is a 110 cm high mobile robot with a 3-DOF head. The robot is equipped with, among other things, a FireWire camera in its head and a laser sensor on the front side. Communication between the robot and the user can be handled with speakers, a belly screen, touch sensors or machine vision methods. Qutie also has a removable fur skin to make the fiberglass chassis softer and more comfortable to touch.

Qutie's frame is made of steel and aluminium. The mobility of the robot is solved by building it on top of a Nomadic Super Scout II. All except the motors, wheels, batteries and basic skeleton have been removed from the Scout to make the base as light as possible. Navigation is based mainly on sensor data from the wheels and information from the laser sensor. The 240-degree field of view of the laser sensor is enough for avoiding obstacles in the driving direction. An overall view of the system components is shown in Figure 3.

Figure 3. Hardware modules of the Qutie robot.

2.3. Design

The robot has been designed in cooperation with industrial design students from the University of Lapland. The goal of this cooperation was to examine possible human-friendly appearances for robots. Essentially, the main considerations were possible shapes and materials that would make the robot easy for humans to approach.

The design of the present Qutie was chosen from several different alternatives. The reasons for choosing this particular model were mostly its playful appearance and its potential for future development. To make people regard a robot as a playful and human-friendly thing, the robot must not only look nice but also act and sound nice. This makes the robot easy for a human to approach. We improved the original design with eyes that are equipped with servos for movement and RGB LEDs that show different colors. With the help of the adjustable eyes and two antennas on top of the robot's head, Qutie can show three basic moods: sad, happy and angry. These three basic emotions offer a good base for creating a personality for the robot.

2.4. Modularity

As mentioned earlier, modularity makes the development process faster and easier. One of the early goals in the Qutie project has been research on modular methods for building complex mechatronic systems [2]. Modularity is important for several reasons, but most of all, it makes expansion and modification possible by changing the configuration of modules instead of the complete system. In a laboratory environment, modularity also offers advantages in the reusability and maintainability of system modules. Modules can be recycled into other projects, and if a module malfunctions, only the one in question needs to be replaced.

Modularity in the Qutie robot is based on two major items: the Atomi concept and the Property Service. The Property Service architecture is presented in Chapter 3 together with a description of the software. The need for rapid system development and prototyping inspired our laboratory to develop an object-oriented embedded system development method based on small embedded objects called Atomis [2]. In brief, Atomis are electronic boards that consist of sensor circuits, actuator drivers, or other functionalities. The Atomi concept contains several modules that each have a specific task on the board.
The boards can be stacked together to build up whole systems. The boards are interconnected through a simple field bus that is extended with a common voltage supply line, and this interconnection is connected to the computer via USB. In Qutie we have Atomis in three different places, with altogether six different Atomi modules. The modules can be seen in Figure 4.

Figure 4. Atomi modules of the Qutie robot. The Atomi modules are interconnected and connected to the computer via USB.

Since previous work [2], several improvements have been made by changing the Atomis to the more reliable Atomi v2 versions [5]. The configurations of the Atomis have also been changed, so that we now have three different Atomi objects: for the head, the neck and the base of the robot.

2.5. Development

The Qutie robot has gone through a long development phase. Most of the improvements compared with the original version [1] are related to updating the technical equipment of the robot, for example the computer, camera and electronics. There have been several rounds of updates, such as replacing the original single-board computer of the Scout, running at 200 MHz, with an 800 MHz Pentium 3 single-board computer. Later the computer was updated to a 2.0 GHz dual-core Intel T2500 processor. This improves performance and makes it possible to run advanced machine vision algorithms onboard. The biggest development in the robot has been the upgrade of the robot's head from a 2-DOF stepper-motor mechanism to a 3-DOF servo-driven neck mechanism. This rearrangement has made many new things possible, especially regarding the machine vision system and overall interaction with the surroundings. The machine vision system uses a FireWire camera instead of the earlier low-cost USB camera.
This makes different kinds of recognition methods more efficient in the HRI region and brings a dramatic improvement in picture quality. Interaction between the human and the robot has itself become the most important task for future planning. To facilitate the HRI, a touch-sensitive EMFi layer [6] has been deployed on the robot's body surface.

3. SOFTWARE ARCHITECTURE

Qutie contains several operating units which all need software to operate. Adding functionalities to the robot system after the first version may become problematic if the structure of the system is not planned systematically enough from the beginning. The complexity of Qutie and the possibility of extension are handled with a dynamic and modular software architecture [7], called the Property Service. The Property Service Architecture supports several types of software architectures. In Qutie, a hybrid architecture containing fully reactive and sequential parts has been used. This is discussed further in Chapter 3.2.

3.1. Property Service

The Property Service Architecture (PSA) [2] is based on a device-centric view of the system, where each device (e.g. servos, laser and driving motors) provides a service.

Each service contains a set of properties, including features, configuration and functionalities related to the device. PSA provides a simple interface for setting and getting different values of the device and for activating the operation of the device. Each device thus contains a set of properties that can be "get" and "set". PSA is used for communication between processes and for remote operation of the robot. PSA also provides interfaces for the implementation of distributed systems [7], where services are distributed among different machines. It has been successfully used, e.g., in the remote operation of heterogeneous robots and in the operation of a robot swarm [8]. The architecture also provides several higher-level control services and abstract data storages. In this work, PSA is mainly used for communication between processes inside the robot.

Each device of the robot can be used through the PSA interface. In Qutie, each Atomi provides a set of functionalities, and PSA combines these into a service. As Atomi modules can be attached during operation, new devices and their properties show up on the list of properties of the service immediately. This way, the configuration of the robot can be changed during operation.

The advantages of the Property Service lie in both modularity and the abstraction level of information. Different robots can be used with the same higher-level control software. For example, commanding the robot to "move one meter forward" has the same effect regardless of the robot's size or whether it uses wheels or legs for moving. In Qutie's case, with the help of the Property Service, other kinds of robots can be operated with the developed methods, as long as their architecture is based on the Property Service Architecture. Figure 5 shows a simple example of set and get commands in the Property Service.

    set mood happy
    set movement.target (10 10 0)
    get movement
    get location

Figure 5. Set and get commands in PSA.
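The set and get commands above can be sketched as a minimal property store. This is an illustrative sketch only; the class name, string-only values and flat key space are our simplifying assumptions, not the actual PSA implementation.

```cpp
#include <map>
#include <string>

// Hypothetical sketch of a Property Service style interface: each device
// exposes named properties that can be "set" and "get" over one interface.
class PropertyService {
public:
    // Create or update a named property.
    void set(const std::string& name, const std::string& value) {
        properties_[name] = value;
    }
    // Return the property value, or an empty string if it is unknown.
    std::string get(const std::string& name) const {
        auto it = properties_.find(name);
        return it == properties_.end() ? std::string() : it->second;
    }
private:
    std::map<std::string, std::string> properties_;
};
```

Because the interface is uniform, a newly attached module only needs to register its properties to become visible to the rest of the system, which matches the hot-plugging behavior described above.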
3.2. Qt

Qt [11] is a C++ class library for developing applications. Qt includes hundreds of ready-made classes based on various design patterns. The robot's architecture is a hybrid architecture containing reactive and sequential parts. The software in the robot has been implemented using C++ and the Qt library. Probably the most useful pattern is the signal-slot concept, where class outputs and inputs can be connected and disconnected during operation. In Qutie, this is used to define reactions to certain inputs. For example, a sensor value is emitted to an actuator control slot. Signals and slots provide an easy way to create reactive functionalities for a robot. The sources and targets of signals can also be either lower-level connections, like sensors and actuators, or higher-level methods, like emotion stimulations, control algorithm parameters, etc. For example, we can connect an EMFi sensor output signal to an emotion stimulation slot, so that each pressure change stimulates the robot's mood. An example of such a connection is presented in Figure 6.

    connect(comm, SIGNAL(ad1(float)), emotions, SLOT(stimulateHappiness(float)));

Figure 6. A signal-slot connection in Qt.

3.3. Machine vision

An essential part of the robot's architecture is the machine vision module: Qutie has a multipurpose vision system onboard. Several basic algorithms have been implemented to make it possible to detect both humans and several kinds of objects. The vision system includes face detection, shape detection, color detection and texture detection. Each method provides output in a unified format called markers, which are used in higher-level controls. A marker is a tracking unit/structure that contains a location and detected features. For example, when the robot sees a ball, the vision system produces a marker for the ball, including its estimated location and the detected features of its color and shape. The features are used for simple classification of targets and as an aid in human interaction.
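The unified marker format described above might be sketched roughly as follows; the field names, coordinate convention and classifier are our own illustrative assumptions, not Qutie's actual types.

```cpp
#include <string>
#include <vector>

// Illustrative sketch of a vision "marker": a tracked target bundling an
// estimated location with the features the detectors reported for it.
struct Marker {
    double x = 0.0, y = 0.0;            // estimated location of the target
    std::vector<std::string> features;  // e.g. "red", "round", "face"
};

// Trivial classification over the feature list: a marker whose features
// include "face" is treated as tracking a human.
bool tracksHuman(const Marker& m) {
    for (const auto& f : m.features)
        if (f == "face") return true;
    return false;
}
```

Keeping all detectors behind one structure like this is what lets the higher-level controls consume face, shape, color and texture results interchangeably.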
Similarly, face detection gives a marker that tracks a human.

3.4. Belly screen

In addition to visualization of the vision system, the robot can show different kinds of information on its screen. The user interface of the software contains a widget for showing symbols and text on the screen. The color and background of this full-screen view can be set according to the information shown. Commonly used colors emphasize messages; for example, for urgent messages the background can be set to red and blinking symbols can be used.

3.5. Speech synthesis

Speech is one of the most natural ways to affect the surroundings. The development of the speech system in Qutie is still at an early stage, but the first results can already be reported. The robot's software contains a simplified dialog system (shown in Figure 7). Qutie uses a few predefined commands and responses, which have been given to the dialog module. The dialog module contains simple subject detection and parameter extraction from predefined forms of sentences.

We have also implemented a Skype interface to provide simplified telepresence. A remote user can call Qutie's Skype account and use Skype chat to control the robot with textual commands. The dialog also understands several commonly used emoticons (like the smileys ":)" and ":(") and changes the robot's expressions accordingly. Different detected parameters can be connected to the rest of the software system using signals and slots. A remote operator can, for example, send a sequence of several property calls through chat to create complex operations. This can also be used to produce output text. The latest Skype version also provides a video phone feature, but it is not yet available in the Linux version of Skype. Later, it will make it possible to use Qutie as a mobile video phone as well.

Figure 7. Qutie's dialog system.

4. QUTIE IN ACTION

In Qutie, we have already tested a few methods for solving problems like navigating in an environment, getting information through sensors (touch sensors, laser) and, of course, getting the system to work together in a modular fashion (electronics, computer and software). Expanding the system has also been tested successfully by adding more sensors to the system.

4.1. Qutie, a friendly guide robot

The first real task for Qutie, after the building phase, is giving guided tours in the Robotics laboratory at the University of Oulu. In brief, the guiding task includes four main parts: moving, talking, sensing touch and using machine vision to detect obstacles and human faces. The guiding task offers a challenging field for developing the robot. There are several problems to solve: How to make the robot's information delivery clear enough to be understood by humans? How to make the presentation more colorful (monotone speech alone would make the presentation very dull)? These kinds of concerns make the guiding-robot task not only a technical but also a psychological issue.

Building a guiding robot started from the idea of developing a robot for presenting the Robotics laboratory of the University of Oulu. There are many visitors in the laboratory all year round, and people come to see and hear about robots and research work in robotics.
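The smiley handling of the dialog module described in Section 3.5 could be sketched along these lines; the function name, mood labels and neutral default are illustrative assumptions rather than Qutie's actual code.

```cpp
#include <string>

// Hedged sketch: scan an incoming Skype chat message for the common
// smileys ":)" and ":(" and map them to a mood label for the robot.
std::string moodFromMessage(const std::string& msg) {
    if (msg.find(":)") != std::string::npos) return "happy";
    if (msg.find(":(") != std::string::npos) return "sad";
    return "neutral";   // no recognized emoticon in the message
}
```

The returned mood label would then be emitted through the signal-slot mechanism so that the eyes and antennas show the matching expression.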
The idea of making a robot do the presentation work made sense, but it also offered a great opportunity to do research on people's expectations of and thoughts about robots.

4.2. First public performance

Qutie has already given some public performances. The first and biggest event for Qutie was the European Information Society Technologies (IST) event, held in Helsinki on 21-23 November 2006 [12]. Qutie was part of an exhibition for Oulu Visions (Robotics Group and Machine Vision Group), and it played a noticeable role in the middle of the exhibition area. Qutie demonstrated its machine vision system and talked to visitors walking by. The IST event showed that people are really interested in robots and are really open to having robots in different areas of society. The event was also very challenging, because the lighting of the showground consisted of bright lights in various colors (green, pink, orange). The difficult lighting disturbed the face tracking of the machine vision system, and the loud background noise also made it difficult to hear what the robot said.

5. CONCLUSION

Qutie has been upgraded from its earlier version. It contains more features and sensors, making the operation of the robot more efficient. The first public performances have been made, and they have provided lots of feedback to the project. Several tests with different configurations of the robot have been made. The modularity needed to expand the system has been tested by adding components during the project (servos for the antennas and extra EMFi layers).

People's expectations of robots are often related to appearance: interaction skills should correspond to the appearance of the robot. If a robot's appearance is very human-like, it should also be able to communicate like a human. If a robot looks very playful, it makes sense that the robot also acts like it. A dialog subsystem provides easily expandable textual and spoken interaction.
Using the Skype API, we provide a commonly used way to remotely operate the robot without a need to install other software on the remote machine. One of the advantages of Skype is also that it has been designed for use through firewalls, and it is therefore easier to use and connect with.

ACKNOWLEDGEMENTS

This research was partially funded by the Academy of Finland and Infotech Oulu.

6. REFERENCES

[1] A. Tikanmäki, J. Riekki, J. Röning, Qutie - an interactive mobile robot, ICAR 2003 - International Conference on Advanced Robotics, Jun 30 - Jul 4, Coimbra, Portugal (2003).

[2] A. Tikanmäki, T. Vallius, J. Röning, Qutie - Modular methods for building complex mechatronic systems, ICMA - International Conference on Machine Automation, Nov 24-26, Osaka, Japan (2004).

[3] G. Schweitzer, What Do We Expect from Intelligent Robots?, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems (1999).

[4] C. Breazeal, B. Scassellati, How to build robots that make friends and influence people (1999).

[5] T. Vallius, J. Röning, ATOMI II - Framework for Easy Building of Object-oriented Embedded Systems, Proc. 9th Euromicro Conference on Digital System Design: Architectures, Methods and Tools, Aug 30 - Sep 1, Dubrovnik, Croatia, pp. 464-472 (2006).

[6] http://www.emfi.net (19.2.2007)

[7] T. Mäenpää, A. Tikanmäki, J. Riekki, J. Röning, A Distributed Architecture for Executing Complex Tasks with Multiple Robots, ICRA 2004 - International Conference on Robotics and Automation, Apr 26 - May 1, New Orleans, LA, USA (2004).

[8] A. Tikanmäki, J. Haverinen, A. Kemppainen, J. Röning, Remote-operated robot swarm for measuring an environment, ICMA 2006 - International Conference on Machine Automation, Jun 7-8, Seinäjoki, Finland (2006).

[9] http://www.electrolux.com (22.2.2007)

[10] http://www.ubiko.jp (22.2.2007)

[11] http://www.trolltech.com (23.2.2007)

[12] http://www.ist2006.fi/ (23.2.2007)