Introduction to Embedded and Real-Time Systems W12: An Introduction to Localization Techniques in Embedded Systems
Outline
Motivation
Terminology and classification
Selected positioning systems and techniques
Odometry: inverse kinematic model, position reconstruction
Motivation
Motivating Application: Sensor Networks
Goal: develop a self-calibrating system to support collaborative acoustic sensing applications, such as beam-forming and cross-beam localization.
Target system input: node placement in 3D, outdoor, foliage OK; 20 m inter-node spacing; arrays are level; ~70x50 m deployment area.
Target system output: estimates of XYZ position (±25 cm) and orientation (±2°).
Results in James Reserve: accurate (mean 3D position error: 20 cm) and precise (std. dev. of node position: 18 cm). [From L. Girod and D. Estrin, UCLA-CENS]
Motivating Application: Distributed Coverage and Inspection
Case study: turbine inspection with 40 Alice II robots (2 x 2 x 2 cm), equipped with vision and Zigbee-compliant communication.
Two scenarios: no localization vs. global localization.
Comparing coordination schemes: reactive vs. deliberative, collaborative vs. non-collaborative. [Correll & Martinoli, IEEE RAM, 2009]
Motivating Application: Distributed Coverage and Inspection Localization uniquely based on odometry and crude recognition of blade features with proximity sensors Localization based on a unique color ID for each blade and odometry
Terminology and Classification
Classification axes Indoor vs. outdoor techniques Absolute vs. relative positioning systems Line-of-sight vs. obstacle passing/surrounding Underlying physical principle and channel Positioning available on-board vs. off-board Scalability in terms of number of nodes
Performance of Positioning Systems
As with any other sensor, a position sensor is characterized by its accuracy, precision, range, and positioning update frequency; σ = standard deviation of the sensor noise. [From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]
Indoor Positioning Systems
Selected Indoor Positioning Systems Laser-based indoor GPS Ultrasound (US) + radio frequency (RF) technology Infrared (IR) + RF technology Vision-based overhead system Impulse Radio Ultra Wide Band (IR-UWB)
Laser-Based Indoor Positioning (KPS)
Performance: a few mm in position over a 5x5 m arena, 25-50 Hz, a few degrees in orientation.
Position available on the robot without communication (GPS-like).
Line-of-sight method.
Tested in 2D but extensible to 3D (2 laser base stations).
Ultrasound + Radio Technology
Principle: time of arrival on 3 (2D) or 4 (3D) US receivers, synchronized with a radio signal.
Used for relative (on the robots) and absolute positioning (fixed beacons).
Accuracy: sub-cm over several m for a 30 cm radius platform (e.g., Michaud et al., ICRA 2008); accuracy improves with the size of the module (it scales with the distance between US receivers).
Update rate: 1/(0.075 · N_robots) Hz, e.g., < 1 Hz with 14 or more robots (Michaud et al., ICRA 2008).
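The time-of-arrival principle above can be sketched in code: once the US/RF synchronization has turned arrival times into ranges, the position follows from multilateration. A minimal 2D sketch, assuming known beacon positions and already-computed ranges (the function name and setup are illustrative, not from any cited system):

```python
import numpy as np

def trilaterate_2d(beacons, ranges):
    """Estimate a 2D position from ranges to N >= 3 fixed beacons.

    Subtracting the first range equation from the others removes the
    quadratic terms and leaves a linear system A p = b in the unknown
    position p, solved here by least squares (robust to noisy ranges).
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With 4 non-coplanar beacons the same linearization extends directly to 3D.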
Ultrasound + Radio Technology [From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]
Infrared + Radio Technology
Principle: belt of IR emitters (LEDs) and receivers (photodiodes); the IR LEDs are used as antennas with modulated light (10.7 MHz carrier), and an RF chip behind them measures RSSI.
Measures range and bearing of neighboring robots; can be coupled with an RF channel (e.g., 802.11) for heading assessment.
Can also be used as a 20 kbit/s communication channel. [Pugh et al., IEEE Trans. on Mechatronics, 2009]
Infrared + Radio Technology
Figures: range response and bearing calculation.
Infrared + Radio Technology
Performance summary: range 3.5 m; update frequency 25 Hz with 10 neighboring robots (or 250 Hz with 2); range accuracy < 7% (max); bearing accuracy < 9° (RMS).
Possible extensions: 3D operation, larger range (but more power), and better bearing accuracy with more photodiodes (e.g., Bergbreiter, PhD UCB 2008: dedicated ASIC, up to 15 m, 256 photodiodes, single emitter with conical lens).
SwisTrack
Tracking objects with one (or more) overhead cameras.
Absolute positions, available outside the robot/sensor; active, passive, or no markers.
Open-source software; major issues: lighting, calibration.
Price per camera: CHF 100.- to 10000.-
Accuracy ~ 1 cm (2D); update rate ~ 20 Hz; # agents ~ 100; area ~ 10 m².
IR-UWB Ubisense
Tracking UWB tags; absolute positions, available outside the robot/sensor.
Multiple antennas; tag battery lasts 5 years; 6-8 GHz UWB channel.
Price: EUR 13000.- for the base system, EUR 70.- per additional tag.
Accuracy 15 cm (3D); update rate 40 Hz per tag; # agents ~ 10000; area ~ 1000 m².
Outdoor Positioning Systems
Selected Outdoor Positioning Techniques
GPS
Differential GPS (DGPS)
US/Sound + RF
Global Positioning System
Global Positioning System
24 satellites (including three spares) orbit the Earth every 12 hours at a height of 20,190 km.
The location of any GPS receiver is determined through time-of-flight measurements.
Real-time update of the exact location of the satellites: a number of widely distributed ground stations monitor the satellites, and a master station analyses all the measurements and transmits the actual position to each of the satellites.
Exact measurement of the time of flight: the receiver correlates a pseudocode with the same code coming from the satellite; the delay time of the best correlation represents the time of flight.
Because the quartz clocks on GPS receivers are not very precise, range measurements to four satellites are needed to identify the three position values (x, y, z) plus the clock correction ΔT.
Recent commercial GPS receivers allow position accuracies down to a couple of meters.
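The four-satellite solve described above can be illustrated numerically. This is a minimal Gauss-Newton sketch of the navigation solution, not a real receiver's algorithm: the function name, satellite geometry, and starting guess (Earth's center) are illustrative assumptions, and real receivers must additionally handle satellite motion, atmospheric delays, and ephemeris errors.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def solve_gps(sat_pos, pseudoranges, iters=20):
    """Gauss-Newton solve for receiver position and clock correction.

    sat_pos:      (N, 3) satellite positions at transmit time [m], N >= 4
    pseudoranges: (N,) measured rho_i = ||s_i - p|| + c * dT  [m]
    Returns (position [m], clock correction dT [s]).
    """
    s = np.asarray(sat_pos, dtype=float)
    rho = np.asarray(pseudoranges, dtype=float)
    x = np.zeros(4)  # unknowns: x, y, z, clock bias (bias kept in meters)
    for _ in range(iters):
        diff = s - x[:3]
        d = np.linalg.norm(diff, axis=1)      # geometric ranges
        residual = rho - (d + x[3])           # measurement mismatch
        # Jacobian of the predicted pseudorange w.r.t. the four unknowns:
        # unit vectors from satellites toward the receiver, plus a bias column
        J = np.hstack([-diff / d[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
    return x[:3], x[3] / C
```

Note how a 1 ms receiver clock error corresponds to roughly 300 km of pseudorange; solving for ΔT jointly with (x, y, z) is what makes cheap quartz clocks usable.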
DGPS
Position accuracy: typically from a few to a few tens of cm.
Distributed Acoustic Sensing
Application requirements: cross-beam localization requires 3D array position and level orientation.
Design constraints: 3D terrain, 3D target locations; spacing requirement: 20+ meters; accuracy requirement: ±2° bearing, ±25 cm position; resilient to the environment (ground foliage, background noise, weather conditions). [From L. Girod and D. Estrin, UCLA-CENS]
Problem Statement
Goal: develop a self-calibrating system to support collaborative acoustic sensing applications, such as beam-forming and cross-beam localization.
Target system input: node placement in 3D, outdoor, foliage OK; 20 m inter-node spacing; arrays are level; ~70x50 m deployment area.
Target system output: estimates of XYZ position (±25 cm) and orientation (±2°).
Results in James Reserve: accurate (mean 3D position error: 20 cm) and precise (std. dev. of node position: 18 cm). [From L. Girod and D. Estrin, UCLA-CENS]
Position Estimation
Layered architecture (figure): the application triggers coded-signal emission; a time-synchronized sampling service layer selects a code and runs the detection algorithm, producing <Code, Detect Time> tuples; the ranging layer turns these into <Range, θ, φ> estimates; the multilateration layer combines them into <X, Y, Z, θ>; time-sync and control traffic travel over a multi-hop network layer. [From L. Girod and D. Estrin, UCLA-CENS]
Acoustic Array Configuration
4 condenser microphones arranged in an 8 cm square, with one raised to 14 cm; 4 piezo tweeter emitters pointing outwards.
The array mounts on a tripod or stake, wired to the CPU box.
The coordinate system defines the angles (θ, φ) relative to the array. [From L. Girod and D. Estrin, UCLA-CENS]
Experiments
Component testing: azimuth angle test, zenith angle test, range test (24-foot cement wall).
System testing: Court of Sciences test, James Reserve test. [From L. Girod and D. Estrin, UCLA-CENS]
Odometry
Motivating Example from Two Weeks Ago: Behavior-Based Control with Motor Schemas
Diagram: sensors feed Detect-obstacles → Avoid-obstacle and Detect-Goal → Move-to-Goal; the two schema outputs are summed (Σ) and sent to the actuators.
Visualization of Vector Field for Ex. 5: Avoid-obstacle
S = obstacle's sphere of influence, R = radius of the obstacle, G = gain, d = distance from robot to obstacle's center
V_magnitude = 0 for d > S
V_magnitude = G · (S − d)/(S − R) for R < d ≤ S
V_magnitude = ∞ for d ≤ R
V_direction = radially along the line between robot and obstacle center, directed away from the obstacle
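The piecewise magnitude rule above maps directly to code. A minimal sketch of the avoid-obstacle schema in 2D (the function name and the zero-vector handling at d = 0 are my own choices, not from the exercise):

```python
import math

def avoid_obstacle(robot, obstacle, R, S, G):
    """Repulsive vector of the avoid-obstacle motor schema.

    robot, obstacle: (x, y) positions; R: obstacle radius;
    S: sphere of influence; G: gain. Returns the vector (vx, vy).
    """
    dx, dy = robot[0] - obstacle[0], robot[1] - obstacle[1]
    d = math.hypot(dx, dy)
    if d > S:
        mag = 0.0                      # outside the sphere of influence
    elif d > R:
        mag = G * (S - d) / (S - R)    # grows linearly as the robot approaches
    else:
        mag = float('inf')             # inside the obstacle: maximal repulsion
    # direction: radially away from the obstacle center
    return (mag * dx / d, mag * dy / d) if d > 0 else (0.0, 0.0)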
Visualization of Vector Field for Ex. 5: Move-to-goal
Output = vector (magnitude, direction)
V_magnitude = fixed gain value
V_direction = towards the perceived goal
Implementation of Move-to-Goal as Motivation
For moving from location A to location B (e.g., from a starting point to the goal location), do we need a GPS-like signal or a globally defined and perceivable field? No! We can reconstruct the global pose of a vehicle (position and orientation) using onboard, proprioceptive movement sensors, in both indoor and outdoor environments.
Not Only Robots
Example: the Cataglyphis desert ant; excellent studies by Prof. R. Wehner (University of Zurich, Emeritus).
Individual foraging strategy; underlying mechanisms: internal compass (polarization of sunlight), dead reckoning (path integration on the neural chains for leg control), local search (within 1-2 m of the nest).
Extremely accurate navigation: average error of a few tens of cm over a 500 m path!
Odometry: Principles
Q: can we track the absolute position (in the global/environmental reference frame) based on movement information exclusively measured by on-board proprioceptive sensors?
A: yes, using odometry! (plus knowledge of the initial position)
Needed: DC motors with encoders (closed-loop control) or stepper motors (open-loop, but with a pre-established fixed increment per pulse, as on the e-puck).
Optical Encoders
Measure the position (or speed) of the wheels.
Principle: a mechanical light chopper consisting of photo-barriers (pairs of light emitter and optical receiver) plus a pattern on a disc anchored to the motor shaft.
Quadrature encoder: two complete photo-barriers placed 90° apart give a 4x resolution increase plus the direction of movement.
Integrating the wheel movements yields an estimate of the position -> odometry.
Typical resolutions: 64-2048 increments per revolution; for higher resolution: interpolation.
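The quadrature principle above (two channels 90° out of phase, 4x resolution, direction recovery) can be sketched as a small state machine. This is an illustrative software decoder, whereas real systems usually decode in hardware or in an interrupt handler:

```python
# Channels A and B are 90 degrees out of phase, so the order of edges encodes
# the direction. A lookup on (previous state, new state) yields -1 or +1 per
# valid transition, counting 4 ticks per slot on the disc (the 4x increase).
_QUAD_DELTA = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b01, 0b00): -1, (0b11, 0b01): -1, (0b10, 0b11): -1, (0b00, 0b10): -1,
}

class QuadratureDecoder:
    def __init__(self):
        self.state = 0b00
        self.count = 0  # signed ticks; position = count / (4 * CPR) revolutions

    def update(self, a, b):
        """Feed one sample of the A and B channels (0 or 1 each)."""
        new = (a << 1) | b
        # repeated or illegal (double-edge) transitions contribute 0
        self.count += _QUAD_DELTA.get((self.state, new), 0)
        self.state = new
        return self.count
```

Wheel displacement then follows as 2πr · count / (4 · CPR) for a disc with CPR slots per revolution and wheel radius r.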
Position and Orientation of a Differential-Drive Robot
Pose in the inertial frame {X_I, Y_I}: ξ_I = [x_I, y_I, θ]^T
Pose in the robot frame {X_R, Y_R}, with origin at the robot center P: ξ_R = R(θ) ξ_I
Rotation matrix:
R(θ) = [ cos θ, sin θ, 0 ; −sin θ, cos θ, 0 ; 0, 0, 1 ]
[From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]
Absolute and Relative Velocity of a Differential-Drive Robot
ξ̇_R = R(θ) ξ̇_I
Example, θ = π/2:
R(π/2) = [ 0, 1, 0 ; −1, 0, 0 ; 0, 0, 1 ], so ẋ_R = ẏ_I, ẏ_R = −ẋ_I, θ̇_R = θ̇_I
Forward Kinematic Model
How does the robot move given the wheel speeds and the geometry?
Assumption: no wheel slip! In miniature robots there are no major dynamic effects due to the low mass.
ξ̇_I = [ẋ_I, ẏ_I, θ̇]^T = f(l, r, θ, φ̇_1, φ̇_2)
where φ̇_i = speed of wheel i, r = wheel radius, l = distance of each wheel from the robot center P.
Forward Kinematic Model
Linear velocity = average of the contributions of wheels 1 and 2:
v = (r φ̇_1)/2 + (r φ̇_2)/2
Rotational velocity = sum of the rotation contributions of the two wheels (wheel 1 turning forward rotates the robot counterclockwise, wheel 2 clockwise):
ω = (r φ̇_1)/(2l) − (r φ̇_2)/(2l)
Note: each wheel's linear velocity equals its rotational speed times the wheel radius.
Forward Kinematic Model
1. ξ̇_I = R(θ)^{-1} ξ̇_R
2. ẋ_R = v = (r φ̇_1)/2 + (r φ̇_2)/2
3. ẏ_R = 0
4. θ̇ = ω = (r φ̇_1)/(2l) − (r φ̇_2)/(2l)
Combining:
ξ̇_I = [ẋ_I, ẏ_I, θ̇]^T = R(θ)^{-1} [ (r φ̇_1)/2 + (r φ̇_2)/2, 0, (r φ̇_1)/(2l) − (r φ̇_2)/(2l) ]^T
with R(θ)^{-1} = [ cos θ, −sin θ, 0 ; sin θ, cos θ, 0 ; 0, 0, 1 ]
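The forward kinematic model is compact enough to write out directly. A minimal sketch, assuming wheel 1 is the right wheel and l is the wheel-to-center distance as in the model above (the function name is illustrative):

```python
import math

def forward_kinematics(phi1_dot, phi2_dot, theta, r, l):
    """World-frame velocities of a differential-drive robot (no slip).

    phi1_dot, phi2_dot: wheel angular speeds [rad/s] (1 = right, 2 = left)
    theta: current heading; r: wheel radius; l: wheel-to-center distance
    Returns (x_dot, y_dot, theta_dot) in the inertial frame.
    """
    v = r * (phi1_dot + phi2_dot) / 2.0            # linear speed along x_R
    omega = r * (phi1_dot - phi2_dot) / (2.0 * l)  # rotation rate
    # rotate the robot-frame velocity (v, 0) into the inertial frame
    return v * math.cos(theta), v * math.sin(theta), omega
```

Equal wheel speeds give pure translation along the heading; opposite speeds give rotation in place.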
Odometry
Q: given our absolute velocity, how can we calculate the robot position after some time T? A: integrate!
Given the kinematic forward model, and assuming no slip on either wheel:
ξ_I(T) = ξ_I(0) + ∫₀ᵀ ξ̇_I dt = ξ_I(0) + ∫₀ᵀ R(θ)^{-1} ξ̇_R dt
Given an initial position and orientation ξ_I(0), after time T the position and orientation of the vehicle will be ξ_I(T), computable from the wheel speeds φ̇_1 and φ̇_2 and the parameters r and l.
Note: in practice wheel slippage is always present, so the positional error of odometry is cumulative and increases incrementally.
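On an embedded controller the integral above is computed numerically from sampled encoder speeds. A minimal sketch using a forward-Euler step (the function name and fixed-step scheme are my own choices; real implementations often use the arc-based exact update or finer timesteps):

```python
import math

def integrate_odometry(pose, wheel_speeds, r, l, dt):
    """One Euler step of odometric dead reckoning.

    pose: (x, y, theta); wheel_speeds: (phi1_dot, phi2_dot) [rad/s]
    r: wheel radius; l: wheel-to-center distance; dt: timestep [s]
    Note: in practice this estimate drifts, since slip and quantization
    errors accumulate at every step and are never corrected.
    """
    x, y, theta = pose
    phi1_dot, phi2_dot = wheel_speeds
    v = r * (phi1_dot + phi2_dot) / 2.0
    omega = r * (phi1_dot - phi2_dot) / (2.0 * l)
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)
```

Calling this in a loop at the encoder sampling rate reconstructs ξ_I(T) from ξ_I(0), which is exactly the integral on this slide.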
Examples of Odometric Error Evolution for a Differential-Wheel Vehicle
Notes: Gaussian assumption; the ellipses show the 3σ bound.
On a straight line, the error grows faster in the direction orthogonal to the trajectory.
The main axes of the error ellipses are not always perpendicular to the trajectory. [From Introduction to Autonomous Mobile Robots, Siegwart R. and Nourbakhsh I. R.]
Conclusion
Take-Home Messages
A number of techniques for node localization have been developed.
There are differences between indoor and outdoor techniques (range, infrastructure, accuracy).
Some of them require external infrastructure (e.g., the creation of an artificial field), while others rely on complete on-board equipment (e.g., odometry).
Different errors and costs characterize the different positioning and localization systems.
Additional Literature Week 12
Books:
Siegwart R. and Nourbakhsh I. R., Introduction to Autonomous Mobile Robots, MIT Press, 2004.
Borenstein J., Everett H. R., and Feng L., Navigating Mobile Robots: Systems and Techniques, A. K. Peters, Ltd., 1996.
Everett H. R., Sensors for Mobile Robots: Theory and Applications, A. K. Peters, Ltd., 1995.