Building a Computer Vision Research Vehicle with ROS


ROSCon 2017, Vancouver, 2017-09-21
Andreas Fregin, Markus Roth, Markus Braun, Sebastian Krebs & Fabian Flohr

Agenda
1. Introduction
2. History
3. Triggering a Heterogeneous Sensor Setup
4. Our Calibration Solution
5. Enhancing ROS Tools / Handling Data
6. Q&A

About Us
Daimler is the corporate parent of Mercedes-Benz. The authors started as PhD students in the Pattern Recognition and Cameras team. Main research topics: pedestrian intention recognition and traffic light recognition. Interests: object recognition from camera images and machine learning. We use ROS as our research framework for computer vision.
Sebastian Krebs, Markus Braun, Andreas Fregin, Markus Roth, Fabian Flohr

How I Came to ROS
2011/12, university: RoboCup@Work created the need for a framework, and we used ROS (Basic Transportation Test, Precision Placement Test / league winners). 2015, Daimler: used an established automotive framework, but missed the simplicity, the introspection, and especially the documentation (wiki) of ROS, and came back to ROS.

Daimler's History in ADAS & Autonomous Driving Research
Research milestones in active safety and traffic management: Emergency Call, Lane Departure Warning, Stop & Go Assist, Blind Spot Warner, Speech Input/Output, Emergency Braking Assist, Speed Limit Assist, Adaptive Cruise Control, Head-up Display, Night View, Lane Keeping Assist, Attention Assist, Digital Map, RDS-TMC, Dynamic Navigation, Travel Information Services, Adaptive LSA, Mobile Service, PTA, Strategy Management, Online Services, Floating Car Data, and finally Autonomous Driving.

Our ROSified Research Vehicles

Universal CAN Message Decoder
A message generator for CAN-bus messages: the CAN message description is automatically converted into .msg files, from which gencpp, genpy, and genlisp generate the client-library messages. A decoder node then takes raw CAN frames from the CAN hardware and publishes decoded topics such as w221_body_can/velocity, w221_body_can/yaw_rate, w221_body_can/stw_angle, and w221_body_can/radar_objects.
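The decoding step can be sketched in plain Python. This is a minimal illustration only: the CAN id 0x123, the byte layout, and the 0.01 m/s scale factor are invented for the example and are not the real W221 body-CAN definition.

```python
import struct

# Hypothetical signal layout: id, offset, and scale are illustrative only.
VELOCITY_CAN_ID = 0x123

def decode_velocity(can_id, data):
    """Decode a raw 8-byte CAN payload into a (signal_name, value) pair."""
    if can_id == VELOCITY_CAN_ID:
        # Assumed: unsigned 16-bit little-endian raw value, 0.01 m/s per bit.
        raw, = struct.unpack_from('<H', data, 0)
        return ('velocity', raw * 0.01)
    return None  # frame id not handled by this decoder

frame = struct.pack('<H', 2500) + b'\x00' * 6  # fake 8-byte CAN payload
print(decode_velocity(VELOCITY_CAN_ID, frame))  # velocity in m/s
```

In the real pipeline this mapping is generated from the CAN message description rather than hand-written, and the decoded value is published as a typed ROS message instead of printed.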

Enabling Low-Level Sensor Fusion
Target: capture the surroundings at the same moment in time across different sensors, and precisely time-stamp the sensor readings.
Constraints: heterogeneous sensors, different sensor nodes, possibly different cycle rates, and unstamped sensor data from the CAN-bus.

Software Triggering
Use the host PC to software-trigger all sensors: the PC triggers camera1-camera4, and their sensor data, together with the CAN busses, flows into bag_rec.

Hardware Triggering
Use a trigger generator to hardware-trigger all sensors: all cameras now expose at the same acquisition time, but when exactly? The trigger signal reaches camera1-camera4; their data and the unstamped CAN-bus data are recorded by bag_rec.

Sensors Do Not Know About Reference Time
The exposure was triggered at the exact same moment, so the images show the same content. However, the processing time of heterogeneous setups varies, so the images arrive at different moments in time. Timestamping with ros::Time::now() therefore yields different timestamps, and it is not correct anyway (arrival time vs. acquisition time!). We need to know the moment of triggering in reference time.

PTP Time-Sync (Precision Time Protocol, IEEE 1588)
The trigger generator (a microcontroller) does not know about reference time, so it is time-synchronized via STM32F4 + lwIP + ROSUDP + PTPd. With each trigger signal, a trigger message (std_msgs/Header) is also generated and published. The sensor nodes receive the trigger message before the sensor data arrives, which allows properly timestamped images while ensuring all the different sensor data carry the exact same timestamp. Result: all camera images show the exact same moment AND we know the timestamp of that moment.
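The trigger-message idea can be illustrated with a small pure-Python sketch (no ROS required): trigger stamps are queued by sequence number, and when a frame finally arrives it is stamped with the acquisition time from the matching trigger rather than its arrival time. The class and its fields are hypothetical, not the actual implementation.

```python
from collections import deque

class TriggerStamper:
    """Match late-arriving sensor data to the trigger that caused it."""

    def __init__(self, depth=64):
        # Recent (sequence number, reference timestamp) pairs.
        self.pending = deque(maxlen=depth)

    def on_trigger(self, seq, stamp):
        # Trigger messages are small and arrive before the image data.
        self.pending.append((seq, stamp))

    def on_frame(self, seq):
        # Return the acquisition time for this frame, not its arrival time.
        for s, stamp in self.pending:
            if s == seq:
                return stamp
        return None  # no matching trigger seen

stamper = TriggerStamper()
stamper.on_trigger(41, 100.000)
stamper.on_trigger(42, 100.040)
print(stamper.on_frame(42))  # acquisition timestamp of frame 42
```

All sensor nodes that stamp their data this way emit identical timestamps for the same trigger, which is exactly what downstream fusion needs.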

Hardware Triggering with Known Time
The microcontroller now does know about reference time (via PTP from the PC, which acts as PTP master) and publishes each trigger as a ROS std_msgs/Header message on a trigger topic. Camera data can thus be stamped with the trigger time before being recorded by bag_rec, alongside the CAN busses.

Synchronization with Velodyne LiDARs
The microcontroller is now the PTP time master, using GPS time (obtained by parsing the NMEA string), and the PC is a PTP time slave. The Velodyne, which is synchronized via GPS as well, the cameras, and the CAN busses therefore all share one reference time in bag_rec.

Calibration
Whenever you fuse data, you need to know about times AND coordinate frames. Even small (sub-degree) errors in orientation result in huge position errors for distant objects, so we need a good extrinsic calibration of the cameras and laser scanners.
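A quick back-of-the-envelope check of that claim: the lateral offset caused by a yaw error grows as range * tan(error), so even a fraction of a degree matters at typical driving distances. (Pure arithmetic, not from the talk.)

```python
import math

# Lateral offset at a given range caused by an extrinsic yaw error.
def lateral_offset(range_m, yaw_error_deg):
    return range_m * math.tan(math.radians(yaw_error_deg))

for deg in (0.1, 0.5, 1.0):
    off = lateral_offset(100.0, deg)
    print('%.1f deg error at 100 m -> %.2f m offset' % (deg, off))
```

At 100 m, a 0.1-degree orientation error already shifts an object by roughly 0.17 m, and a 1-degree error by about 1.75 m, around half a lane width.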

Intrinsic Camera Calibration
cameracalibrator.py comes with the OpenCV checkerboard detector and has an informative UI that teaches you where to hold the checkerboard (X/Y/size). But it picks images from the running video, so the user doesn't get a chance to hold still to avoid motion blur; it does not allow modifying the data used for the calibration step; and it does not generate a sensor-to-car transformation.

Calibration Requirements
- One-man show
- On-demand checkerboard detection
- Live detection inspection
- Remove images
- Add specific images

Server-Client Calibration Using Car-PC + Linux Tablet
Topics: the cal-server on the car PC receives the camera images (image_left/image_right), tf, and total_st., and streams a compressed side-by-side image (image_sbs/compressed) to the cal-gui on the tablet. Services (provided by the cal-server, called from the cal-gui): trigger detection, trigger calibration, delete detection, load data, save data.

Timeshift Recording
A good example of a ROS-tool enhancement: start recording in the past. Incoming messages are kept in a RAM buffer (seconds, minutes, or hours of recording), and a trigger topic provides a delayed start/stop. Rosbag player enhancements: a step topic (play/pause) and triggered playback.
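The RAM-buffer idea behind "start recording in the past" is simple to sketch in plain Python: every incoming message is appended to a ring buffer that forgets anything older than the configured horizon, and the trigger flushes the buffered history to disk. Names and structure here are illustrative, not the actual tool.

```python
from collections import deque

class TimeshiftBuffer:
    """Keep the last `horizon` seconds of messages in RAM."""

    def __init__(self, horizon):
        self.horizon = horizon  # seconds of history to retain
        self.buf = deque()      # (timestamp, message) pairs, oldest first

    def add(self, stamp, msg):
        self.buf.append((stamp, msg))
        # Drop everything older than the horizon relative to the newest stamp.
        while self.buf and stamp - self.buf[0][0] > self.horizon:
            self.buf.popleft()

    def flush(self):
        # On trigger: hand the buffered history to the writer, start fresh.
        out = list(self.buf)
        self.buf.clear()
        return out

buf = TimeshiftBuffer(horizon=5.0)
for t in range(12):
    buf.add(float(t), 'msg%d' % t)
print([s for s, _ in buf.buf])  # only the last 5 seconds survive
```

A real implementation additionally has to budget RAM for large image messages, which is why the slide distinguishes between seconds, minutes, and hours of buffered recording.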

Rewriting Rosbags
Don't be afraid of using the rosbag Python API to modify existing rosbags: add sensor data, add TF (calibration), add ground truth, or correct data (e.g. frame_ids, image_encodings, ...).

Powerful Tools / Packages
- Complex image processing setups can be built using nodelets
- Strongly typed messages lead to node exchangeability; for example, different detectors can all use the same input/output messages
- Extremely powerful packages like image_geometry speed up research
- tf serves as the central place for transformations
- The launch system is very helpful (especially the ability to include other launch files)
- Diagnostics capabilities, ...

Our Lessons Learned
- ROS already includes the concepts needed to realize complex, heterogeneous sensor setups
- ROS can handle high data throughput and high cycle rates
- ROS is a good starting point for handling large data
- If your needs exceed what ROS comes with, extend it!

Questions?