Introduction to Embedded and Real-Time Systems W12: An Introduction to Localization Techniques in Embedded Systems


Outline
- Motivation
- Terminology and classification
- Selected positioning systems and techniques
- Odometry
- Inverse kinematic model
- Position reconstruction

Motivation

Motivating Application: Sensor Networks
Goal: develop a self-calibrating system to support collaborative acoustic sensing applications, such as beam-forming and cross-beam localization.
Target system:
- Input: node placement in 3D, outdoor, foliage OK; ~20 m inter-node spacing; arrays are level; deployment area ~70 x 50 m
- Output estimates: XYZ position ±25 cm, orientation ±2°
Results in James Reserve:
- Accurate: mean 3D position error of 20 cm
- Precise: std. dev. of node position of 18 cm
[From L. Girod and D. Estrin, UCLA-CENS]

Motivating Application: Distributed Coverage and Inspection
Case study: turbine inspection
- 40 Alice II robots (2 x 2 x 2 cm)
- Vision, Zigbee-compliant communication
- 2 scenarios: no localization vs. global localization
- Comparing coordination schemes: reactive vs. deliberative, collaborative vs. non-collaborative
[Correll & Martinoli, IEEE RAM, 2009]

Motivating Application: Distributed Coverage and Inspection
- Scenario 1: localization based solely on odometry and crude recognition of blade features with proximity sensors
- Scenario 2: localization based on a unique color ID for each blade, combined with odometry

Terminology and Classification

Classification axes
- Indoor vs. outdoor techniques
- Absolute vs. relative positioning systems
- Line-of-sight vs. obstacle passing/surrounding
- Underlying physical principle and channel
- Positioning available on-board vs. off-board
- Scalability in terms of number of nodes

Performance of Positioning Systems
As with any other sensor, a position sensor is characterized by its accuracy, precision, range, and positioning update frequency (σ = standard deviation of the sensor noise).
[From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]

Indoor Positioning Systems

Selected Indoor Positioning Systems
- Laser-based indoor GPS
- Ultrasound (US) + radio frequency (RF) technology
- Infrared (IR) + RF technology
- Vision-based overhead system
- Impulse Radio Ultra Wide Band (IR-UWB)

Laser-Based Indoor GPS (KPS)
- Performance: a few mm in position over a 5 x 5 m arena, 25-50 Hz update rate, a few degrees in orientation
- Position available on the robot without communication (GPS-like)
- Line-of-sight method
- Tested in 2D but extensible to 3D (2 laser base stations)

Ultrasound + Radio Technology
- Principle: time of arrival on 3 (2D) or 4 (3D) US receivers, synchronized with a radio signal
- Used for relative (on the robots) and absolute positioning (fixed beacons)
- Accuracy: sub-cm over several m for a 30 cm radius platform (e.g., Michaud et al., ICRA 2008)
- Accuracy scales with the size of the module, i.e., with the distance between US receivers
- Update rate: 1/(0.075 · N_robots) Hz, e.g., < 1 Hz with 14 or more robots (Michaud et al., ICRA 2008)

Ultrasound + Radio Technology [From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]
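The time-of-arrival principle above reduces to trilateration: three range measurements to beacons at known 2D positions determine the receiver position. A minimal sketch, assuming exactly three beacons (the beacon layout and positions below are made up for illustration):

```python
import math

def trilaterate_2d(beacons, ranges):
    """2D position from ranges to exactly 3 known beacons.

    Subtracting the first range equation (x-x0)^2 + (y-y0)^2 = r0^2
    from the other two cancels the quadratic terms, leaving a 2x2
    linear system solved here with Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = beacons
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 + y1**2 - x0**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 + y2**2 - x0**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical beacons at three room corners; true position (2, 1) m
beacons = [(0.0, 0.0), (5.0, 0.0), (0.0, 4.0)]
ranges = [math.hypot(2 - bx, 1 - by) for bx, by in beacons]
print(trilaterate_2d(beacons, ranges))  # ≈ (2.0, 1.0)
```

With four beacons and 3D positions the same subtraction trick yields a 3x3 system; real systems additionally average over noisy range measurements.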

Infrared + Radio Technology
- Principle: belt of IR emitters (LEDs) and receivers (photodiodes)
- IR LEDs used as antennas; modulated light (10.7 MHz carrier); an RF chip behind the receivers measures the RSSI
- Measures range and bearing of neighboring robots; can be coupled with an RF channel (e.g., 802.11) for heading assessment
- Can also be used as a 20 kbit/s communication channel
[Pugh et al., IEEE Trans. on Mechatronics, 2009]

Infrared + Radio Technology (plots: range response, bearing calculation)

Infrared + Radio Technology
Performance summary:
- Range: 3.5 m
- Update frequency: 25 Hz with 10 neighboring robots (or 250 Hz with 2)
- Range accuracy: < 7% (max)
- Bearing accuracy: < 9° (RMS)
- Possible extensions: 3D operation, larger range (at the cost of more power), and better bearing accuracy with more photodiodes (e.g., Bergbreiter, PhD UCB 2008: dedicated ASIC, up to 15 m, 256 photodiodes, single emitter with conical lens)
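A bearing estimate from a ring of photodiodes can be sketched as a weighted vector sum: each photodiode contributes a unit vector along its facing direction, weighted by its RSSI. This is a common generic scheme, not necessarily the exact method of Pugh et al.:

```python
import math

def estimate_bearing(rssi):
    """Bearing of an IR emitter from the signal strength seen by N
    photodiodes spaced evenly around the robot (sensor i faces
    angle 2*pi*i/N). The angle of the RSSI-weighted vector sum is
    the bearing estimate.
    """
    n = len(rssi)
    sx = sum(s * math.cos(2 * math.pi * i / n) for i, s in enumerate(rssi))
    sy = sum(s * math.sin(2 * math.pi * i / n) for i, s in enumerate(rssi))
    return math.atan2(sy, sx)

# 8 photodiodes; strongest response on sensor 2 (facing 90 degrees)
rssi = [0.0, 0.3, 1.0, 0.3, 0.0, 0.0, 0.0, 0.0]
print(math.degrees(estimate_bearing(rssi)))  # ≈ 90.0
```

The weighted sum interpolates between sensors, which is why the bearing accuracy can be much better than the 45° angular spacing of the photodiodes themselves.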

SwisTrack
- Tracks objects with one (or more) overhead cameras
- Absolute positions, available outside the robot/sensor
- Active, passive, or no markers
- Open-source software
- Major issues: lighting, calibration
- Price per camera: CHF 100 to 10,000
- Accuracy: ~1 cm (2D)
- Update rate: ~20 Hz
- Number of agents: ~100
- Area: ~10 m²

IR-UWB Ubisense
- Tracks UWB tags
- Absolute positions, available outside the robot/sensor
- Multiple antennas
- Tag battery lasts 5 years
- 6-8 GHz UWB channel
- Price: EUR 13,000 for the base system, EUR 70 per additional tag
- Accuracy: 15 cm (3D)
- Update rate: 40 Hz per tag
- Number of agents: ~10,000
- Area: ~1000 m²

Outdoor Positioning Systems

Selected Outdoor Positioning Techniques
- GPS
- Differential GPS (DGPS)
- US/sound + RF

Global Positioning System

Global Positioning System
- 24 satellites (including three spares) orbit the Earth every 12 hours at a height of 20,190 km.
- The location of any GPS receiver is determined through time-of-flight measurements.
- The exact location of the satellites is updated in real time: the satellites are monitored from a number of widely distributed ground stations, and a master station analyzes all the measurements and transmits the actual position to each of the satellites.
- For an exact measurement of the time of flight, the receiver correlates a pseudocode with the same code coming from the satellite; the delay time giving the best correlation represents the time of flight.
- The quartz clocks on GPS receivers are not very precise, so ranging to four satellites is needed to identify the three position values (x, y, z) plus the clock correction ΔT.
- Recent commercial GPS receivers achieve position accuracies down to a couple of meters.
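Solving for the four unknowns (x, y, z, ΔT) from four or more pseudoranges can be sketched as a Gauss-Newton iteration on the equations ρᵢ = ‖p − sᵢ‖ + b, where b is the clock bias expressed as a distance. The satellite geometry below is made up and uses kilometers for readability:

```python
import math

def solve_gps(sats, pseudoranges, iters=10):
    """Gauss-Newton fit of receiver position (x, y, z) and clock
    bias b to pseudoranges rho_i = ||p - s_i|| + b. Four satellites
    give four equations for the four unknowns."""
    x = y = z = b = 0.0
    for _ in range(iters):
        JTJ = [[0.0] * 4 for _ in range(4)]  # normal equations J^T J
        JTr = [0.0] * 4                      # and J^T r, built by hand
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            d = math.sqrt((x - sx)**2 + (y - sy)**2 + (z - sz)**2)
            row = [(x - sx) / d, (y - sy) / d, (z - sz) / d, 1.0]
            resid = rho - (d + b)
            for i in range(4):
                JTr[i] += row[i] * resid
                for j in range(4):
                    JTJ[i][j] += row[i] * row[j]
        dx = _solve4(JTJ, JTr)
        x += dx[0]; y += dx[1]; z += dx[2]; b += dx[3]
    return x, y, z, b

def _solve4(A, v):
    """Gauss-Jordan elimination with partial pivoting (4x4 system)."""
    M = [A[i][:] + [v[i]] for i in range(4)]
    for c in range(4):
        p = max(range(c, 4), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(4):
            if r != c:
                f = M[r][c] / M[c][c]
                for k in range(c, 5):
                    M[r][k] -= f * M[c][k]
    return [M[i][4] / M[i][i] for i in range(4)]

# Hypothetical geometry (km): receiver at (1, 2, 3), clock bias 0.5
sats = [(20000, 0, 0), (0, 20000, 0), (0, 0, 20000), (12000, 12000, 12000)]
rho = [math.dist((1, 2, 3), s) + 0.5 for s in sats]
print(solve_gps(sats, rho))  # ≈ (1.0, 2.0, 3.0, 0.5)
```

Because the satellites are so far away, the problem is nearly linear and the iteration converges in a handful of steps; real receivers additionally weight satellites by geometry and signal quality.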

DGPS: position accuracy typically from a few cm to a few tens of cm

Distributed Acoustic Sensing
Application requirements: cross-beam localization requires 3D array position and level orientation.
Design constraints:
- 3D terrain, 3D target locations
- Spacing requirement: 20+ meters
- Accuracy requirement: 2° bearing, ±25 cm position
- Resilient to the environment: ground foliage, background noise, weather conditions
[From L. Girod and D. Estrin, UCLA-CENS]


Position Estimation: System Layers
- Application layer: emits a coded signal, selects the code
- Time-synchronized sampling service layer
- Ranging layer: detection algorithm produces <Code, Detect Time> tuples, combined into <Range, θ, φ> measurements
- Multilateration layer: fuses <Range, θ, φ> measurements into pose estimates <X, Y, Z, θ>
- Multi-hop network layer: carries triggers, time sync, and control traffic
[From L. Girod and D. Estrin, UCLA-CENS]

Acoustic Array Configuration
- 4 condenser microphones arranged in a square (8 cm side), with one raised by 14 cm
- 4 piezo tweeter emitters pointing outwards
- Array mounts on a tripod or stake, wired to the CPU box
- The coordinate system defines the azimuth θ and zenith φ angles relative to the array
[From L. Girod and D. Estrin, UCLA-CENS]

Experiments
- Component testing: azimuth angle test, zenith angle test, range test (24 feet against a cement wall)
- System testing: Court of Sciences test, James Reserve test
[From L. Girod and D. Estrin, UCLA-CENS]

Odometry

Motivating Example from Two Weeks Ago: Behavior-Based Control with Motor Schemas
(Block diagram: the sensors feed the Detect-Obstacles and Detect-Goal perceptual schemas, which drive the Avoid-Obstacle and Move-to-Goal motor schemas; their output vectors are summed (Σ) and sent to the actuators.)

Visualization of Vector Field for Ex. 5: Avoid-Obstacle
$$V_{magnitude} = \begin{cases} 0 & \text{for } d > S \\ \dfrac{S - d}{S - R}\,G & \text{for } R < d \le S \\ \infty & \text{for } d \le R \end{cases}$$
where S = the obstacle's sphere of influence, R = the radius of the obstacle, G = gain, d = distance from the robot to the obstacle's center.
V_direction: radially along the line between the robot and the obstacle center, directed away from the obstacle.
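The piecewise magnitude of the avoid-obstacle schema can be transcribed directly (the formulation follows Arkin's standard motor-schema definition; the numeric values in the example are made up):

```python
def avoid_obstacle_magnitude(d, R, S, G):
    """Avoid-obstacle output magnitude: zero outside the sphere of
    influence S, growing linearly from 0 at d = S up to G near the
    obstacle radius R, and unbounded at or inside the obstacle."""
    if d > S:
        return 0.0
    if d > R:
        return G * (S - d) / (S - R)
    return float('inf')

# Gain 1, obstacle radius 1 m, sphere of influence 3 m
print(avoid_obstacle_magnitude(4.0, 1.0, 3.0, 1.0))  # → 0.0 (outside S)
print(avoid_obstacle_magnitude(2.0, 1.0, 3.0, 1.0))  # → 0.5 (halfway)
```

The direction component is simply the unit vector from the obstacle center to the robot, scaled by this magnitude.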

Visualization of Vector Field for Ex. 5: Move-to-Goal
Output = vector (r, φ) (magnitude, direction)
V_magnitude = fixed gain value
V_direction = towards the perceived goal

Implementation of Move-to-Goal as Motivation
For moving from location A to location B (e.g., from a starting point to the goal location), do we need a GPS-like signal or a globally defined and perceivable field? No! We can reconstruct the global pose of a vehicle (position and orientation) using on-board, proprioceptive movement sensors, in both indoor and outdoor environments.

Not Only Robots
Example: the Cataglyphis desert ant (excellent studies by Prof. R. Wehner, University of Zurich, Emeritus)
- Individual foraging strategy
- Underlying mechanisms: internal compass (polarization of sunlight), dead reckoning (path integration on neural chains for leg control), local search (within 1-2 m of the nest)
- Extremely accurate navigation: average error of a few tens of cm over a 500 m path!

Odometry: Principles
Q: Can we track the absolute position (in a global/environmental reference frame) based on movement information measured exclusively by on-board proprioceptive sensors?
A: Yes, using odometry (plus knowledge of the initial position)!
Needed: DC motors + encoders (closed-loop control), or stepper motors (open-loop, but with a pre-established fixed increment per pulse, as on the e-puck).

Optical Encoders
- Measure the position (or speed) of the wheels
- Principle: a mechanical light chopper consisting of photo-barriers (pairs of light emitter and optical receiver) plus a patterned disc anchored to the motor shaft
- Quadrature encoder: two complete photo-barriers placed 90° apart give a 4x resolution increase plus the direction of movement
- Integrating the wheel movements yields an estimate of the position -> odometry
- Typical resolutions: 64-2048 increments per revolution; for higher resolution: interpolation
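Quadrature decoding can be sketched in a few lines: the two channels (A, B) are 90° out of phase, so the sequence of 2-bit states reveals both the step and its direction. This is a generic illustration of the principle, not a driver for any specific encoder:

```python
# Valid (A,B) state cycle when turning in one direction; the reverse
# order corresponds to the opposite direction.
GRAY = [0b00, 0b01, 0b11, 0b10]

def quad_step(prev, curr):
    """Return +1, -1, or 0 ticks for one (A,B) state transition."""
    if prev == curr:
        return 0
    i, j = GRAY.index(prev), GRAY.index(curr)
    if (i + 1) % 4 == j:
        return 1
    if (i - 1) % 4 == j:
        return -1
    return 0  # skipped state: missed a sample, count nothing

def count_ticks(states):
    """Accumulate signed ticks over a sampled sequence of states."""
    return sum(quad_step(a, b) for a, b in zip(states, states[1:]))

# One full electrical cycle = 4 ticks: the "4x" resolution increase
fwd = [0b00, 0b01, 0b11, 0b10, 0b00]
print(count_ticks(fwd))        # → 4
print(count_ticks(fwd[::-1]))  # → -4
```

Wheel displacement then follows from the tick count: with CPR counts per revolution and wheel radius r, distance = 2πr · ticks / (4 · CPR).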

Position and Orientation of a Differential-Drive Robot
The pose of the robot in the inertial frame is $\xi_I = [x_I, y_I, \theta]^T$; in the robot frame it is $\xi_R = R(\theta)\,\xi_I$, with the rotation matrix
$$R(\theta) = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
[From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]

Absolute and Relative Velocity of a Differential-Drive Robot
$$\dot{\xi}_R = R(\theta)\,\dot{\xi}_I = R(\theta)\,[\dot{x}_I, \dot{y}_I, \dot{\theta}]^T$$
Example: for $\theta = \pi/2$,
$$R(\pi/2) = \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

Forward Kinematic Model
How does the robot move given the wheel speeds and geometry?
Assumptions: no wheel slip; in miniature robots, no major dynamic effects due to the low mass.
$$\dot{\xi}_I = \begin{bmatrix} \dot{x}_I \\ \dot{y}_I \\ \dot{\theta} \end{bmatrix} = f(l, r, \theta, \dot{\varphi}_1, \dot{\varphi}_2)$$
where r = wheel radius, l = distance from each wheel to the robot center P, and $\dot{\varphi}_i$ = speed of wheel i.

Forward Kinematic Model
Linear velocity = average of the two wheels' contributions:
$$v = \frac{r\dot{\varphi}_1 + r\dot{\varphi}_2}{2}$$
Rotational velocity = sum of the two wheels' signed contributions (the two wheels rotate the robot in opposite senses):
$$\omega = \frac{r\dot{\varphi}_1}{2l} - \frac{r\dot{\varphi}_2}{2l}$$
Note: each wheel's linear velocity equals its rotational speed times the wheel radius.

Forward Kinematic Model
Combining the pieces in the robot frame:
1. $\dot{x}_R = v = \dfrac{r\dot{\varphi}_1 + r\dot{\varphi}_2}{2}$
2. $\dot{y}_R = 0$
3. $\dot{\theta} = \omega = \dfrac{r\dot{\varphi}_1}{2l} - \dfrac{r\dot{\varphi}_2}{2l}$
4. Mapping back to the inertial frame, $\dot{\xi}_I = R(\theta)^{-1}\,\dot{\xi}_R$:
$$\dot{\xi}_I = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \frac{r\dot{\varphi}_1 + r\dot{\varphi}_2}{2} \\ 0 \\ \frac{r\dot{\varphi}_1}{2l} - \frac{r\dot{\varphi}_2}{2l} \end{bmatrix}$$

Odometry
Q: Given our absolute velocity, how can we calculate the robot position after some time T?
A: Integrate! Given the forward kinematic model, and assuming no slip on either wheel:
$$\xi_I(T) = \xi_I(0) + \int_0^T \dot{\xi}_I\, dt = \xi_I(0) + \int_0^T R(\theta)^{-1}\,\dot{\xi}_R\, dt$$
Given an initial position and orientation $\xi_I(0)$, the pose $\xi_I(T)$ after time T is computable from the two wheel speeds and the parameters r and l.
Note: in practice wheel slippage is always present, so the positional error of odometry is cumulative and increases incrementally.
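In software the integral is usually approximated numerically, e.g. with a simple Euler step at each encoder sampling interval. A minimal sketch of differential-drive odometry following the forward model above (the wheel speeds, radius, and axle half-length in the example are made-up values):

```python
import math

def odometry_step(x, y, theta, phi1, phi2, r, l, dt):
    """One Euler step of differential-drive odometry.

    phi1, phi2: wheel speeds [rad/s]; r: wheel radius [m];
    l: distance from each wheel to the robot center [m].
    """
    v = r * (phi1 + phi2) / 2.0          # linear velocity
    omega = r * (phi1 - phi2) / (2 * l)  # rotational velocity
    x += v * math.cos(theta) * dt        # project onto inertial frame
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Pure rotation: opposite wheel speeds give v = 0, omega = 2 rad/s
# (r = 2 cm, l = 5 cm); integrate 1 s in 1 ms steps
x, y, th = 0.0, 0.0, 0.0
for _ in range(1000):
    x, y, th = odometry_step(x, y, th, 5.0, -5.0, 0.02, 0.05, 0.001)
print(round(x, 6), round(y, 6), round(th, 3))  # → 0.0 0.0 2.0
```

Each step adds a small error from slip, quantization, and the Euler approximation itself, which is exactly why the odometric position error is cumulative.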

Examples of Odometric Error Evolution for a Differential-Wheel Vehicle
Notes:
- Gaussian assumption; the ellipses show the 3σ bound
- On a straight line, the error grows faster in the direction orthogonal to the trajectory
- The main axes of the error ellipses are not always perpendicular to the trajectory
[From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]

Conclusion

Take Home Messages
- A number of techniques for node localization have been developed
- Indoor and outdoor techniques differ in range, infrastructure, and accuracy
- Some techniques require external infrastructure (e.g., the creation of an artificial field); others rely on fully on-board equipment (e.g., odometry)
- Different errors and costs characterize the different positioning and localization systems

Additional Literature Week 12
Books:
- Siegwart R. and Nourbakhsh I. R., Introduction to Autonomous Mobile Robots, MIT Press, 2004.
- Borenstein J., Everett H. R., and Feng L., Navigating Mobile Robots: Systems and Techniques, A. K. Peters, Ltd., 1996.
- Everett H. R., Sensors for Mobile Robots: Theory and Applications, A. K. Peters, Ltd., 1995.