In Proceedings of the 8th ESA Workshop on Advanced Space Technologies for Robotics and Automation 'ASTRA 2004', ESTEC, Noordwijk, The Netherlands, November 2-4, 2004

A simple embedded stereoscopic vision system for an autonomous rover

GIANCARLO GENTA (b), MARCELLO CHIABERGE (c), NICOLA AMATI (b), MAURO PADOVANI (b), CLAUDIO SANSOE (c), PAOLO ROLANDO (a)

(a) Mechatronics Laboratory, (b) Department of Mechanics, (c) Department of Electronics, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino, Italy

ABSTRACT

The authors have developed a number of demonstrators of a legged microrover for planetary exploration, also suitable for terrestrial applications, mainly operation in hazardous environments. Version 6.2, which was extensively tested both in the laboratory and in outdoor Mars-analogue environments, operates autonomously using touch sensors. The performance of the vehicle can be improved if it is teleoperated for trajectory planning and obstacle prediction; an increase of its autonomy works in the same direction. The present paper describes the on-board vision system, developed with the aim of increasing the autonomy of the machine.

1 INTRODUCTION

Rovers operating autonomously at a long distance from the human driver must be supported by robust control and navigation systems. The motion of Walkie 6 [1-5] (Figure 1), a twin rigid-frames walking microrover developed at the Mechatronics Laboratory of the Politecnico di Torino in cooperation with Alenia Spazio, is controlled by a finite state automaton that allows it to walk autonomously on rough terrain. The existing low-level controller is limited by the lack of supporting tools for autonomous operation in unstructured environments. The present paper describes the stereoscopic vision system installed on board and the navigation simulator to be used by the human operator to plan in advance a strategy suited to the terrain to be crossed.

Figure 1: The Walkie 6.2 demonstrator being tested in a Mars-analogue environment (volcanic terrain on Mount Etna).

The hardware and the control algorithm of the stereoscopic vision system were developed to be integrated with the existing electronics, to simplify communication and to reduce the power required for computation. Experimental tests showed that the shape of the obstacles close to the vehicle and their distance could be determined with sufficient accuracy and at low computational cost. This information is computed in real time by the algorithm and is used by the onboard control system to define the trajectory the rover must follow to reach its goal in the shortest time and with minimum energy consumption. The images from the micro cameras, sent by radio link to the human operator, are used to reproduce the environment in which the vehicle is operating, so that the appropriate navigation strategy can be studied in advance. Such a strategy can then be uploaded by radio link, since the new electronic hardware (under construction as version 6.3) is based on programmable logic devices (FPGAs). Commercially available software reproduces and updates the virtual environment on the basis of the visual information.

The information provided by the vision system is an important input for the simulator of the rover, described in detail in [6] and used in the second part of the present paper to study the advantages and drawbacks of several control and navigation strategies. The experimental validation of the model on flat surfaces is presented in [6], while the comparison of numerical and experimental results for the rover walking on rough terrain is described in the present article. The simulator, developed in MATLAB/SIMULINK, was validated experimentally on version 6.2 of the prototype.

2 VISION SYSTEM

The aim of the stereoscopic vision system described below is to equip Walkie 6 with a tool able to provide both scientific images and information for the onboard control system and for the human driver at the control station. The first kind of images are simply processed, compressed and downloaded to the control centre through the telemetry communication channel available on the rover.

The second kind of images are processed directly onboard the rover and provide the control platform with information about the terrain, obstacles, distances, etc.

Figure 2: a) Walkie 6.3 stereoscopic vision system hardware; b) block diagram of the Walkie 6.3 vision system.

Figure 3: Example of acquired images: left and right cameras.

This information is used to modify the trajectory of the rover on the terrain in order to optimize velocity and movements and to save energy. If the navigation system fails to avoid an obstacle, the rover falls back on a low-level obstacle avoidance system that uses contact information to change direction when it is in front of a dangerous obstacle (a stone, a hole or a dangerous slope).

The stereoscopic configuration of the vision system (Figure 2a and b) provides distance information. It is composed of two CMOS cameras (PB300), control logic implemented on an FPGA device (for camera handling and configuration) and a DSP processor (for image management, analysis and compression). The vision system algorithm is based on the similarity of the two acquired images, due to the distance of only 10 cm between the two PB300 sensors. This similarity is exploited both to optimise the compression and to extract the distance of visible, close obstacles. In images like these, the distance of the obstacle (Figure 3), a large stone in this case, is proportional to the inverse of the difference between its positions in the two images (the shift is clearly visible). With the same procedure it is also easy to compress the images before transmission: the left image is transmitted and, instead of the right image, only the optimised difference between the two.
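As an illustration only (the actual DSP implementation is not detailed in the paper), the following C sketch shows the two ideas just described: the pinhole-stereo relation by which the obstacle distance is inversely proportional to the disparity between the two views, using the 10 cm baseline of the camera pair, and the replacement of the right image by its difference against the disparity-shifted left image before transmission. The image size and the focal length in pixels are assumed values.

/*
 * Illustrative sketch only, not the flight software.
 * - Pinhole stereo: obstacle distance Z = f * B / d, where d is the
 *   horizontal disparity and B = 0.10 m is the baseline between the
 *   two PB300 sensors.
 * - Compression: the left image is sent as-is; instead of the right
 *   image, only its difference against the disparity-shifted left
 *   image is sent, which is mostly near zero and compresses well.
 * Image size and focal length (in pixels) are assumed values.
 */
#include <stdint.h>

#define IMG_W      640        /* assumed image width  [pixels] */
#define IMG_H      480        /* assumed image height [pixels] */
#define BASELINE_M 0.10f      /* 10 cm between the two cameras  */
#define FOCAL_PX   500.0f     /* assumed focal length [pixels]  */

/* Distance of a feature seen at column x_left in the left image and
 * x_right in the right image; the disparity d = x_left - x_right grows
 * as the obstacle gets closer. */
static float distance_from_disparity(int x_left, int x_right)
{
    int d = x_left - x_right;
    if (d <= 0)
        return -1.0f;                         /* no valid disparity       */
    return FOCAL_PX * BASELINE_M / (float)d;  /* Z is proportional to 1/d */
}

/* Residual sent instead of the right image: the right frame minus the
 * left frame shifted by the dominant disparity of the scene. */
static void build_residual(const uint8_t *left, const uint8_t *right,
                           int disparity, int16_t *residual)
{
    for (int y = 0; y < IMG_H; y++) {
        for (int x = 0; x < IMG_W; x++) {
            int xs = x + disparity;   /* matching column in the left image */
            uint8_t pred = (xs >= 0 && xs < IMG_W) ? left[y * IMG_W + xs] : 0;
            residual[y * IMG_W + x] = (int16_t)(right[y * IMG_W + x] - pred);
        }
    }
}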

The vision system is currently provided with a console running under Windows OS (Figure 5), where the operator can manage all the settings, download the images and extract the distance information. In the next version the system will be fully integrated with the new electronic subsystem and will provide navigation information directly to the control platform.

Figure 4: Host interface and obstacle monitor.

The new Walkie 6.3 electronic subsystem is completely different from the previous versions, which were based on a single microprocessor controlling the strategy, the movements and the communication links of the rover. The new architecture is structured in layers to allow users to customize the functions of the rover. The action layer (the low-level interface between sensors and motors) is based on a programmable logic device (USER PLD) that implements a finite state automaton controlled by several parameters. Those parameters are handled by the on-board microprocessor (an 80C51, certified for space applications), which supervises the entire electronic subsystem. The finite state automaton (FSA) is one example of a possible rover controller: the user can easily modify the FSA or completely change the control architecture (the USER PLD is entirely at the user's disposal).
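Purely as an illustration of this layered approach, the following C sketch shows a parameter-controlled finite state automaton of the kind the action layer could run. The actual states, parameters and HDL implementation in the USER PLD are not given in the paper, so the states, transitions and thresholds below are hypothetical; in the real subsystem the parameters would be written by the supervising 80C51, as described above.

/*
 * Hypothetical sketch of a parameter-controlled finite state automaton,
 * illustrating the structure of the action layer.  The real controller
 * runs in the USER PLD; these states and parameters are invented for
 * illustration only.
 */
#include <stdbool.h>

typedef enum { WALK_FORWARD, BACK_OFF, TURN, HALT } rover_state_t;

/* Parameters written by the supervising microprocessor to tune the FSA. */
typedef struct {
    int back_off_steps;   /* how many steps to retreat after a contact  */
    int turn_steps;       /* how many steps to turn before walking again */
} fsa_params_t;

typedef struct {
    rover_state_t state;
    int           counter;
} fsa_t;

/* One automaton update per control cycle: the inputs are the touch-sensor
 * contact flag and a stop command from the supervisor. */
static rover_state_t fsa_step(fsa_t *f, const fsa_params_t *p,
                              bool contact, bool stop_cmd)
{
    if (stop_cmd) {
        f->state = HALT;
        return f->state;
    }
    switch (f->state) {
    case WALK_FORWARD:
        if (contact) { f->state = BACK_OFF; f->counter = p->back_off_steps; }
        break;
    case BACK_OFF:
        if (--f->counter <= 0) { f->state = TURN; f->counter = p->turn_steps; }
        break;
    case TURN:
        if (--f->counter <= 0) f->state = WALK_FORWARD;
        break;
    case HALT:
        break;
    }
    return f->state;
}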

Another PLD device is responsible for the service layer. All the on-board services are available to the user: the communication links (bidirectional, for commands and telemetry), accelerometer data handling and the vision system (currently implemented as a separate resource; it will be integrated in the next version). All these services are considered fixed and cannot be changed by the user; future versions of the electronic subsystem will implement more service functions to offer the user greater flexibility [7].

The implemented structure allows the user to store different control strategies in the onboard configuration device (handled by the 80C51) and to download them to the USER PLD only when necessary. This feature improves the fault tolerance and flexibility of the electronic subsystem and allows the user to virtualize the control electronics (a maximum of eight configurations is allowed).

CONCLUSIONS

The addition of a simple vision system able to recognize obstacles and to measure their distance before the rover actually touches them is a worthwhile addition to the demonstrators of the Walkie 6 line. The hardware and software that have been built and tested proved to be effective enough to improve the performance of the machine. This, together with the larger memory storage built into the electronics developed for version 6.3, which allows different control strategies to be stored and used when needed, will increase the rover velocity while decreasing the energy consumption.

The simulator built and validated on version 6.2 of the rover proved to be effective for testing control strategies and optimising the parameters of the system. The same simulator, however, has another use: when the rover operates at interplanetary distances, it can give the human operator/supervisor real-time (and predictive) knowledge of the current situation of the rover, and can therefore be useful for fault detection. Only experimental tests, including the relevant control delay, will show whether this combination of autonomy and human control can overcome the difficulties arising from operation in severe environments at large distances, and therefore with long time delays.

REFERENCES

[1] L. Bussolino, D. Del Corso, G. Genta, M. A. Perino, R. Somma, 1997, ALGEN - A Walking Robotic Rover for Planetary Exploration, Int. Conf. on Mobile Planetary Robots & Rover Roundup, Santa Monica.

[2] N. Amati, M. Chiaberge, G. Genta, E. Miranda, L. M. Reyneri, 1999, Twin Rigid-Frames Walking Microrovers: a Perspective for Miniaturization, Journal of the British Interplanetary Society, Vol. 52, No. 7/8, pp. 301-304.

[3] N. Amati, M. Chiaberge, G. Genta, E. Miranda, L. M. Reyneri, 2000, WALKIE 6 - A Walking Demonstrator for Planetary Exploration, Space Forum, Vol. 5, No. 4, pp. 259-277.

[4] G. Genta, N. Amati, 1998, Performance Evaluation of Twin Rigid-Frames Hexapod Planetary Rovers, Fourth Int. Conf. on Motion and Vibration Control, Zurich.

[5] G. Genta, M. Chiaberge, N. Amati, 2001, Non Zoomorphic Rigid Frames Walking Micro-Rover: from a Demonstrator to an Engineering Prototype, Proceedings of the International Conference on Smart Technology Demonstrators & Devices, Edinburgh, 12-14 December 2001.

[6] G. Genta, N. Amati, M. Padovani, 2002, Performance of Twin Rigid Frames Walking Rover on Uneven Ground: Simulation and Experimental Tests, Proceedings of the 5th Int. Conference on Climbing and Walking Robots, Paris, September 2002, pp. 515-522.

[7] M. Chiaberge, W. Santero, D. Amerio, 2001, Digital Solutions for Reprogrammability, SCI'2001, The Fifth Multi-Conference on Systemics, Cybernetics and Informatics, Orlando.