Robot Localization and Navigation

Robot Localization and Navigation
UPJV, Département EEA, M2 EEAII, parcours ViRob
Academic Year 2017/2018
Fabio MORBIDI, Laboratoire MIS, Équipe Perception Robotique
E-mail: fabio.morbidi@u-picardie.fr
Wednesday 10:00-12:30 and Thursday 9:30-12:00, Rooms 304 and TP204

2 Multi-robot localization (research overview)

3 General problem formulation
Multi-robot localization: estimation of the pose (position and orientation) q_i = [x_i, y_i, θ_i]^T of each robot of a team of mobile robots with respect to a common reference frame, using proprioceptive and exteroceptive measurements.
[Figure: two robots and a beacon or landmark in the common reference frame]

4 What a robot sees...
The circled robot is equipped with a laser rangefinder; its perception of the surrounding environment is shaded in gray.
[Figure: laser-rangefinder view of the circled robot, showing a beacon and a teammate]

5 Cooperative positioning
Idea: use the other robots in the team as moving beacons. The robots are divided into two groups, A and B, and their positions are tracked by repeating move-and-stop actions (see the sketch after this slide):
1. Group A remains stationary at a known position. Group B moves and positions itself relative to group A using information from its proprioceptive sensors.
2. Group B stops after it has traveled an appropriate distance, and its position relative to the group-A robots is measured accurately.
3. The roles of groups A and B are exchanged and the steps above are repeated.
4. The process is repeated until the robots reach their target positions.
Cooperative positioning with multiple robots, R. Kurazume, S. Nagata, S. Hirose, in Proc. IEEE Int. Conf. Robotics and Automation, vol. 2, pp. 1250-1257, 1994
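A minimal sketch of the move-and-stop scheme above, reduced to one robot per group moving along a line; the odometry and measurement noise values and all function names are illustrative assumptions, not the method of Kurazume et al.

```python
import random

ODOM_NOISE = 0.05   # std. dev. of odometry error per move (illustrative)
MEAS_NOISE = 0.01   # std. dev. of the relative-position measurement (illustrative)

def move(true_pos, step):
    """Advance a robot by 'step'; odometry slightly mis-measures the motion."""
    return true_pos + step, step + random.gauss(0.0, ODOM_NOISE)

def cooperative_positioning(target=10.0, step=1.0):
    true = {"A": 0.0, "B": 0.0}   # ground-truth positions (1-D for simplicity)
    est  = {"A": 0.0, "B": 0.0}   # estimated positions; both start known
    mover, anchor = "B", "A"
    while min(true.values()) < target:
        # Steps 1-2: the anchor group stays put; the mover advances, stops,
        # and its position relative to the anchor is measured accurately.
        true[mover], odom = move(true[mover], step)
        est[mover] += odom                                   # dead reckoning
        rel = true[mover] - true[anchor] + random.gauss(0.0, MEAS_NOISE)
        est[mover] = est[anchor] + rel                       # correction
        # Step 3: exchange the roles of the two groups and repeat.
        mover, anchor = anchor, mover
    return true, est

if __name__ == "__main__":
    true, est = cooperative_positioning()
    print("true:", true, "\nest: ", est)
```

The point of the alternation is that the moving group's dead-reckoning error is reset at every stop by the accurate measurement relative to the stationary group.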

6 Cooperative localization
Cooperative localization: the problem of estimating the poses q_i = [x_i, y_i, θ_i]^T of a group of robots in a common fixed frame using relative measurements among the robots.
In general, the ability of the robots to sense each other improves the localization of the entire system with respect to what simple odometry alone would provide.
The fusion of proprioceptive and exteroceptive sensor information is usually performed using the EKF or a particle filter.
[Figure: two robots with poses q_i and q_j in the fixed frame, and a relative measurement between them]

7 Cooperative localization: measurement equation
Let us assume that at step k, robot i observes robot j using its exteroceptive sensor. The measurement equation is then:
z_ij(k) = h(q_i(k), q_j(k)) + r_ij(k),
where r_ij(k) is a zero-mean white Gaussian noise with known covariance matrix.
To implement the correction step of the EKF for solving the cooperative localization problem, we need to compute the two Jacobians (row vectors):
H_i = ∂h(q_i, q_j)/∂q_i,  H_j = ∂h(q_i, q_j)/∂q_j
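A minimal sketch of the corresponding EKF correction step, assuming the joint state stacks the planar poses of all robots and that the measurement model h and the Jacobians H_i, H_j are supplied by the caller (concrete choices appear on the next two slides); numpy and all names are assumptions of this sketch, not part of the lecture.

```python
import numpy as np

def ekf_correction(x, P, z, h, Hi, Hj, R, i, j):
    """EKF update with one relative measurement z = h(q_i, q_j) + r_ij.

    x, P : stacked state [q_1; q_2; ...] (each q = [x, y, theta]) and covariance
    h    : measurement model; Hi, Hj : its Jacobians w.r.t. q_i and q_j
    R    : covariance of the measurement noise r_ij
    i, j : indices of the observing and observed robots (0-based)
    """
    qi, qj = x[3*i:3*i + 3], x[3*j:3*j + 3]
    # Full Jacobian: zero except for the blocks of robots i and j
    H = np.zeros((np.atleast_1d(z).size, x.size))
    H[:, 3*i:3*i + 3] = Hi(qi, qj)
    H[:, 3*j:3*j + 3] = Hj(qi, qj)
    # Standard EKF correction
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    # (for bearing/orientation measurements the innovation should also be
    #  wrapped to [-pi, pi])
    x_new = x + K @ np.atleast_1d(z - h(qi, qj))
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```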

8 Cooperative localization: measurement equation
In 2D, there are three possible types of relative measurements:
1. Relative bearing
2. Relative distance
3. Relative orientation
Other types of measurements can be taken into account as combinations of these three (e.g. relative position = relative bearing + relative distance).
[Figure: relative bearing, relative distance and relative orientation between two robots]

9 Cooperative localization: measurement equation
Let Δx = x_j − x_i and Δy = y_j − y_i.
Relative bearing: h(q_i, q_j) = atan2(Δy, Δx) − θ_i
Relative distance: h(q_i, q_j) = sqrt(Δx² + Δy²)
Relative orientation (linear equation): h(q_i, q_j) = θ_j − θ_i
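A sketch of these three measurement models and their 1×3 Jacobians H_i, H_j, meant to be plugged into the EKF correction sketched after slide 7; the expressions follow the standard planar formulas written above, and the function names are illustrative.

```python
import numpy as np

# Relative measurement models between planar poses q = [x, y, theta],
# with their 1x3 Jacobians w.r.t. q_i (observer) and q_j (observed robot).

def bearing(qi, qj):
    return np.arctan2(qj[1] - qi[1], qj[0] - qi[0]) - qi[2]

def bearing_jacobians(qi, qj):
    dx, dy = qj[0] - qi[0], qj[1] - qi[1]
    d2 = dx**2 + dy**2
    Hi = np.array([[ dy/d2, -dx/d2, -1.0]])
    Hj = np.array([[-dy/d2,  dx/d2,  0.0]])
    return Hi, Hj

def distance(qi, qj):
    return np.hypot(qj[0] - qi[0], qj[1] - qi[1])

def distance_jacobians(qi, qj):
    dx, dy = qj[0] - qi[0], qj[1] - qi[1]
    d = np.hypot(dx, dy)
    Hi = np.array([[-dx/d, -dy/d, 0.0]])
    Hj = np.array([[ dx/d,  dy/d, 0.0]])
    return Hi, Hj

def orientation(qi, qj):
    return qj[2] - qi[2]

def orientation_jacobians(qi, qj):
    # Linear measurement: the Jacobians are constant
    return np.array([[0.0, 0.0, -1.0]]), np.array([[0.0, 0.0, 1.0]])
```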

10 Cooperative localization: observability
If no robot has absolute localization capabilities, the multi-robot system is not observable, i.e. the error will increase indefinitely and the estimates of the poses in the fixed frame will eventually diverge.
However, even if the team is lost in the fixed frame, the error on the relative poses of the robots, q_i − q_j, for all i, j, will converge to zero.
If at least one robot has absolute localization capabilities (e.g., because it has a GPS or is able to detect a beacon of known position), the multi-robot system becomes observable, and the pose estimation error converges to zero. This happens since one robot is able to estimate its pose in the fixed frame and the other robots are able to localize themselves with respect to it.
For further details, see:
Observability Analysis for Mobile Robot Localization, A. Martinelli, R. Siegwart, in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp. 1471-1476, 2005
Distributed Multirobot Localization, S.I. Roumeliotis, G.A. Bekey, IEEE Trans. Robotics and Automation, vol. 18, n. 5, pp. 781-795, 2002

11 Cooperative localization: observability
[Figure, left: no robot with absolute localization capabilities (estimated vs. actual poses). Right: one robot with absolute localization capability (GPS); the estimated and actual poses are close to each other.]

12 Relative Mutual Localization
How can we avoid the observability problem of Cooperative Localization? We can provide each robot with a reference frame with respect to which it cannot get lost: let us define a moving frame attached to each robot.
Relative Mutual Localization (RML): the problem of estimating the relative poses between the moving frames.
Each robot computes an estimate of the poses of its teammates in its own reference frame, and always considers itself in that frame. This approach is also called robo-centric or ego-centric.
For more details, see: Robot-to-robot relative pose estimation from range measurements, X.S. Zhou, S.I. Roumeliotis, IEEE Trans. Robotics, vol. 24, n. 6, pp. 1379-1393, 2008

13 Relative Mutual Localization
[Figure: relative measurements between the robots' moving frames]

14 Absolute Mutual Localization
Each robot has: a fixed frame and an attached moving frame.
Absolute Mutual Localization (AML): the problem of estimating the relative poses between the fixed frames using the relative measurements among the robots.
Applications: map merging, cooperative exploration.
AML is solved if RML is solved and the agents are localized in their own fixed frames.
[Figure: the fixed and moving frames of two robots, the transforms known by each robot, and the relative measurement]
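A minimal SE(2) sketch of the last statement, assuming poses are represented as homogeneous transforms: the fixed-to-fixed transform is obtained by composing robot i's pose in its fixed frame, the RML estimate of robot j in robot i's moving frame, and the inverse of robot j's pose in its own fixed frame. Frame names and numerical values are illustrative assumptions.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous transform of a planar pose [x, y, theta]."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def aml_from_rml(T_Fi_Mi, T_Mi_Mj, T_Fj_Mj):
    """Relative pose between the fixed frames F_i and F_j.

    T_Fi_Mi : robot i's pose in its own fixed frame (robot i is localized)
    T_Mi_Mj : robot j's pose in robot i's moving frame (RML estimate)
    T_Fj_Mj : robot j's pose in its own fixed frame (robot j is localized)
    """
    return T_Fi_Mi @ T_Mi_Mj @ np.linalg.inv(T_Fj_Mj)

# Example: F_j is 2 m to the right of F_i and rotated by 90 degrees
# (arbitrary values, only meant to exercise the composition).
T_Fi_Mi = se2(1.0, 0.5, 0.2)
T_Fj_Mj = se2(0.3, -0.1, 0.4)
T_Fi_Fj_true = se2(2.0, 0.0, np.pi / 2)
T_Mi_Mj = np.linalg.inv(T_Fi_Mi) @ T_Fi_Fj_true @ T_Fj_Mj   # simulated RML output
print(aml_from_rml(T_Fi_Mi, T_Mi_Mj, T_Fj_Mj))              # recovers T_Fi_Fj_true
```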

15 Multi-robot localization: problem summary
Cooperative Positioning: two groups of robots are alternately used as moving beacons.
Cooperative Localization: the robots estimate their poses in a common fixed frame using relative measurements.
Relative Mutual Localization (RML): the robots estimate the changes of coordinates among their attached moving frames using relative measurements (localization of sensor networks is a special case of RML in which the agents or robots are static).
Absolute Mutual Localization (AML): the robots estimate the relative poses between their fixed frames using relative measurements.

16 Multi-robot SLAM (research overview)

17 SLAM problem
The SLAM problem asks if it is possible for a mobile robot to be placed at an unknown location in an unknown environment, and for the robot to incrementally build a consistent map of this environment while simultaneously determining its location within this map.
See video SLAM1

18 SLAM problem
Critical issue in SLAM: loop closure.
[Figure: map built by the robot from Start to End, with a gap remaining before loop closure]
For more details on SLAM, see the survey paper: Simultaneous localization and mapping: part I, H. Durrant-Whyte, T. Bailey, IEEE Robot. Autom. Mag., vol. 13, n. 2, pp. 99-110, 2006
See video SLAM2

19 Cooperative SLAM (C-SLAM)
In a simple 2-D C-SLAM problem, a team of mobile robots moves continuously and randomly in a planar environment, while recording measurements of the relative positions (distance and bearing) of the other robots in the team and of point beacons detected in the environment.
[Figure: two robots, three beacons, and the relative measurements]
Estimating uncertain spatial relationships in robotics, R.C. Smith, M. Self, P. Cheeseman, in Autonomous Robot Vehicles (Eds. I. Cox, G. Wilfong), Springer, pp. 167-193, 1990

20 C-SLAM (cont'd)
The robots use proprioceptive measurements to propagate their position estimates, and are equipped with exteroceptive sensors (e.g. laser rangefinders) that enable them to measure the relative position of other robots and beacons.
An extended Kalman filter is often used to fuse the measurements together, and produce estimates of the positions of the robots and beacons.
[Figure: two robots and three beacons]
See video C-SLAM
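A minimal sketch of the joint EKF state suggested by this slide, assuming unicycle-like robots, static point beacons and a stacked state vector; the state layout, motion model and noise values are illustrative assumptions and not the formulation of the cited works. The correction step would reuse the update sketched after slide 7, with Jacobian blocks for the observing robot and the observed robot or beacon.

```python
import numpy as np

# Joint C-SLAM state: stacked robot poses [x, y, theta] followed by stacked
# beacon positions [x, y]. Robot poses are propagated with odometry; beacons
# are static. Relative robot-robot or robot-beacon measurements can then be
# fused with an EKF correction like the one sketched after slide 7.

N_ROBOTS, N_BEACONS = 2, 3
x = np.zeros(3 * N_ROBOTS + 2 * N_BEACONS)    # joint state
P = 1e-3 * np.eye(x.size)                     # joint covariance

def predict(x, P, odometry, Q):
    """Propagate each robot pose with its odometry (v, w), unit time step."""
    x, P = x.copy(), P.copy()
    F = np.eye(x.size)                        # Jacobian of the motion model
    for r, (v, w) in enumerate(odometry):
        i = 3 * r
        th = x[i + 2]
        x[i]     += v * np.cos(th)
        x[i + 1] += v * np.sin(th)
        x[i + 2] += w
        F[i, i + 2]     = -v * np.sin(th)
        F[i + 1, i + 2] =  v * np.cos(th)
    return x, F @ P @ F.T + Q

# One prediction step with illustrative odometry; beacons get no process noise
Q = np.zeros((x.size, x.size))
Q[:3 * N_ROBOTS, :3 * N_ROBOTS] = 1e-4 * np.eye(3 * N_ROBOTS)
x, P = predict(x, P, odometry=[(0.1, 0.02), (0.1, -0.01)], Q=Q)
```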

21 C-SLAM: some variations on the theme
C-SLAM using a set-membership approach:
Simultaneous localization and map building for a team of cooperating robots: a set membership approach, M. Di Marco, A. Garulli, A. Giannitrapani, A. Vicino, IEEE Trans. Robotics and Automation, vol. 19, n. 2, pp. 238-249, 2003
C-SLAM using particle filters:
Multi-robot simultaneous localization and mapping using particle filters, A. Howard, Int. J. Robotics Research, vol. 25, n. 12, pp. 1243-1256, 2006
Derivation of analytical bounds for the positioning uncertainty in C-SLAM (in terms of the number of beacons and robots, the accuracy of the robots' sensors and the topology of the Relative Position Measurement Graph):
Predicting the performance of cooperative simultaneous localization and mapping (C-SLAM), A.I. Mourikis, S.I. Roumeliotis, Int. J. Robotics Research, vol. 25, n. 12, pp. 1273-1286, 2006