Visione per il veicolo (Vehicle Vision), Paolo Medici, 2017/2018, 02 Visual Perception



Today: Sensor Suite for Autonomous Vehicles; ADAS; Hardware for ADAS

Sensor Suite
Which sensors do you know? Which sensor suite, and which algorithms, are required for:
1. Driving in the desert
2. Driving in a simplified urban environment
3. Driving in a normal urban environment

Sensor Suites for Autonomous Driving: Stanley, Boss, BRAiVE, VIAC, KIT, Google, Tesla
LIDARs for Autonomous Driving
ADAS Cameras for Autonomous Driving

Stanley
- 5 LIDARs
- 1 camera
- 1 RADAR
- GPS/IMU
http://isl.ecst.csuchico.edu/docs/darpa2005/darpa%202005%20stanley.pdf

Boss

Boss http://www.fieldrobotics.org/users/alonzo/pubs/papers/jfr_08_boss.pdf

Boss
- Applanix POS-LV 220/420 GPS/IMU (APLX): submeter accuracy with Omnistar VBS corrections; tightly coupled inertial/GPS bridges GPS outages
- SICK LMS 291-S05/S14 LIDAR (LMS): 180°/90° × 0.9° FOV with 1°/0.5° angular resolution, 80 m maximum range
- Velodyne HDL-64 LIDAR (HDL): 360° × 26° FOV with 0.1° angular resolution, 70 m maximum range
- Continental ISF 172 LIDAR (ISF): 12° × 3.2° FOV, 150 m maximum range
- IBEO Alasca XT LIDAR (XT): 240° × 3.2° FOV, 300 m maximum range
- Continental ARS 300 Radar (ARS): 60°/17° × 3.2° FOV, 60 m/200 m maximum range
- Point Grey Firefly (PGF): high-dynamic-range camera, 45° FOV

The BRAiVE sensing technology Front sensing 4 cameras (2 graylevel, 2 color)

The BRAiVE sensing technology Lateral sensing

The BRAiVE sensing technology Rear sensing

The BRAiVE sensing technology Back sensing Stereo cameras

BRAiVE all-round vision coverage

The BRAiVE sensing technology Single plane laser scanners 2 frontal, 1 backward

The BRAiVE sensing technology Multiplane laser scanner

The BRAiVE sensing technology 16 Laser beams

The BRAiVE sensing technology DGPS + IMU

BRAiVE's processing: BRAiVE's data processing is performed by 4 PCs. Each PC is in charge of specific sensing areas; one PC is in charge of vehicle control.

VIAC, The Sensing Suite: 7 cameras, 4 laser scanners, GPS, V2V radio + additional devices

KIT: two stereo rigs (1392 × 512 px, 54 cm baseline, 90° opening angle), Velodyne HDL-64E laser scanner, GPS+IMU localization

KIT
- 2 Point Grey Flea2 grayscale cameras (FL2-14S3M-C), 1.4 megapixels, 1/2" Sony ICX267 CCD, global shutter
- 2 Point Grey Flea2 color cameras (FL2-14S3C-C), 1.4 megapixels, 1/2" Sony ICX267 CCD, global shutter
- 4 Edmund Optics lenses, 4 mm, opening angle 90°, vertical opening angle of region of interest (ROI) 35°
- 1 Velodyne HDL-64E rotating 3D laser scanner, 10 Hz, 64 beams, 0.09° angular resolution, 2 cm distance accuracy, collecting 1.3 million points/second, field of view: 360° horizontal, 26.8° vertical, range: 120 m
- 1 OXTS RT3003 inertial and GPS navigation system, 6 axes, 100 Hz, L1/L2 RTK, resolution: 0.02 m / 0.1°
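With a stereo rig like the one above (0.54 m baseline), depth follows from disparity as Z = f·B/d. A minimal sketch, assuming an illustrative focal length in pixels (not a calibrated value from this rig):

```python
# Depth from stereo disparity: Z = f * B / d.
# The 0.54 m baseline is from the slide; the focal length in pixels
# is an assumed illustrative value, not a real calibration result.
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.54):
    """Return depth in meters for a given disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(10.0))  # a 10 px disparity -> 37.8 m
```

Note the inverse relation: halving the disparity doubles the estimated depth, so small disparity errors hurt most at long range.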

Figure: Sensor setup. Dimensions and mounting positions of the sensors (red) with respect to the vehicle body. Heights above ground are marked in green and measured with respect to the road surface. Transformations between sensors are shown in blue.

What's the problem with using so many sensors? One has to calibrate and register them:
- different 3D locations
- different capture times
- different types of capture: instantaneous vs. scanning
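Registering two sensors boils down to a rigid transform between their frames, obtained from extrinsic calibration. A minimal sketch, with a hypothetical LiDAR-to-camera rotation and translation (illustrative values, not a real calibration):

```python
import numpy as np

# A point measured in the LiDAR frame is mapped into the camera frame with
# p_cam = R @ p_lidar + t. R and t below are made-up illustrative values.
R = np.eye(3)                     # assume aligned axes for simplicity
t = np.array([0.27, 0.0, -0.08])  # hypothetical lidar-to-camera offset (m)

def lidar_to_camera(p_lidar):
    """Transform a 3D point from the LiDAR frame into the camera frame."""
    return R @ np.asarray(p_lidar, dtype=float) + t

print(lidar_to_camera([10.0, 1.0, 0.5]))
```

The time dimension is the harder part: for a scanning sensor each measurement has its own timestamp, so the transform must be combined with ego-motion compensation.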

Google Car

Google Car (video)

Google

Google

Velodyne LIDAR

Velodyne HDL64 LIDAR

Different Velodyne LIDARs

IBEO LIDARs

Tesla Sensor Suite

Tesla Sensor Suite
- A forward radar
- A forward-looking camera
- 12 long-range ultrasonic sensors positioned to sense 16 feet around the car in every direction at all speeds
- GPS
- A high-precision, digitally controlled electric-assist braking system
Autopilot is on the market on the Model S.

ADAS
- Active Park Assist
- Lane Departure Warning
- Traffic Sign Recognition
- Adaptive Cruise Control (ACC) / Stop & Go
- Forward Collision Warning / Emergency Braking
- Blind Spot Detection
- Intelligent Headlamp Control
- Pedestrian Detection

ADAS

ADAS

ADAS

ADAS

ADAS

Camera: Sensor (resolution, pitch size, technology, sensitivity); Lens (FOV, aperture). The automotive hardware problem.

Automotive Hardware
AEC-Q100 standard. Operating temperature:
- Grade 0: -40 °C to +150 °C
- Grade 1: -40 °C to +125 °C
- Grade 2: -40 °C to +105 °C
- Grade 3: -40 °C to +85 °C
- Grade 4: 0 °C to +70 °C
Storage temperature (higher!)
Mechanical shock
Vibration

CCD Good Sensitivity Optical Blooming!

CCD vs CMOS

Rolling Shutter

Rolling vs Global

Rolling Shutter

Rolling Shutter

Rolling Shutter
- Each image row (pixel) has a different capture time
- Pixels of the dewarped image have a complex time equation
- Precise disparity on the rectified image is impossible
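The per-row capture time can be sketched as follows; the line readout time is an assumed illustrative value, not a specific sensor's datasheet figure:

```python
# Rolling shutter: row r is read out at t0 + r * line_time, so every row
# has its own timestamp. line_time_us is a made-up illustrative value.
def row_timestamp(t0_us, row, line_time_us=30.0):
    """Capture time (microseconds) of a given image row."""
    return t0_us + row * line_time_us

# For a 720-row frame, the last row lags the first by about 21.6 ms:
skew_us = row_timestamp(0.0, 719) - row_timestamp(0.0, 0)
print(skew_us / 1000.0, "ms")
```

At highway speeds (~30 m/s) a 20 ms skew corresponds to roughly 0.6 m of ego-motion between the first and last row, which is why precise geometry from a single rolling-shutter frame is so hard.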

Aperture vs. Shutter
- f/ lens aperture determines depth of field; light through the lens is proportional to 1/f²
- Shutter (exposure) time: light acquired by a pixel is proportional to the shutter time

Light Conversion
- Shutter: light vs. motion blur, light ∝ shutter time
- Aperture: light vs. depth of field, light ∝ 1/f²
- Pixel size (pitch): light vs. resolution, light ∝ pitch²
- Sensitivity / capacity
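The three proportionalities above can be combined into a single relative light budget. A sketch in arbitrary relative units, with illustrative parameter values:

```python
# Relative light gathered by a pixel:
#   light ∝ shutter_time * pitch^2 / f_number^2
# Units are purely relative; only ratios between calls are meaningful.
def relative_light(shutter_s, pitch_um, f_number):
    return shutter_s * pitch_um**2 / f_number**2

a = relative_light(0.01, 3.0, 2.0)   # 10 ms exposure, 3 um pixel, f/2
b = relative_light(0.01, 3.0, 2.8)   # stopping down to f/2.8 ...
print(b / a)                          # ... cuts the light roughly in half
```

This is the trade-off the slide names: each extra photon costs either motion blur (shutter), depth of field (aperture), or resolution (pitch).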

Dynamic Range problem. Day: >10^5 lux; night: <10^-1 lux. Dynamic range: ~120 dB.

Dynamic Range: 8-12 bit ADC
- 8 bit: 256:1, 48 dB
- 10 bit: 1024:1, 60 dB
- 12 bit: 4096:1, 72 dB
- ...
- 20 bit: 1M:1, 120 dB
Hardware vs. multiple shots
Non-linear mapping
Local mapping (tone mapping)
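The dB figures above follow from 20·log10 of the contrast ratio an N-bit ADC can represent. A quick check (helper name is mine):

```python
import math

# Dynamic range of an N-bit ADC in dB: 20 * log10(2^N).
# Reproduces the slide's 8 bit -> 48 dB ... 20 bit -> 120 dB figures.
def adc_dynamic_range_db(bits):
    return 20.0 * math.log10(2 ** bits)

for bits in (8, 10, 12, 20):
    print(bits, "bit:", round(adc_dynamic_range_db(bits), 1), "dB")
```

So covering the full ~120 dB day-to-night range in a single linear readout would need about 20 bits per pixel, which is why HDR hardware or multi-shot blending is used instead.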

HDR Hardware

HDR MultiShot Images copyright Vislab/Ambarella Inc.

HDR Blending
X1' = X1 + noise
X2' = X2 + noise
A = T2 / T1
X1'' = A * X1'
X = X2' * f2 + X1'' * f1
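The blending above can be sketched as follows: scale the short exposure by the exposure ratio A = T2/T1, then mix with weights f1, f2. The weights and pixel values are illustrative, not from the slides:

```python
import numpy as np

# Two-exposure HDR blending: bring the short exposure onto the long
# exposure's scale via A = T2/T1, then blend. f1, f2 are assumed weights.
def hdr_blend(x1, x2, t1, t2, f1=0.5, f2=0.5):
    """x1: short-exposure image (time t1), x2: long-exposure image (time t2)."""
    a = t2 / t1            # exposure ratio A
    x1_scaled = a * x1     # X1'' = A * X1'
    return f2 * x2 + f1 * x1_scaled

short = np.array([10.0, 200.0])   # exposure time 1 ms
long_ = np.array([40.0, 255.0])   # exposure time 4 ms; bright pixel saturated
print(hdr_blend(short, long_, t1=1.0, t2=4.0))
```

In practice the weights f1, f2 vary per pixel, favoring the short exposure where the long one saturates and the long one where the short one is noise-dominated.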

Video Sensors
- Good image quality: high sensitivity over a wide spectrum and a wide dynamic range
- Broad temperature range: -40 °C ... +105 °C
- Some applications also require color
- Global shutter = expensive! Rolling shutter = distortion!

Windshield distortion. The lens distortion model is radial, but windshield distortion is not radial! Spline?
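For contrast, here is the standard radial model the slide refers to (a Brown-style polynomial, with made-up k1, k2 coefficients): the distortion depends only on the radius from the principal point, which is exactly why it cannot represent a windshield's non-radial warp.

```python
# Radial lens distortion, k1/k2 terms only. The correction factor is a
# function of r^2 alone, so it is symmetric around the principal point.
# Coefficients are illustrative, not from a real calibration.
def radial_distort(x, y, k1=-0.2, k2=0.05):
    """Map ideal normalized image coords (x, y) to distorted coords."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

print(radial_distort(0.5, 0.0))
```

A windshield introduces a warp that varies with position in a non-symmetric way, hence the slide's suggestion of a spline defined over the whole image plane instead of a radius-only polynomial.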

Additional Issue. Thermal stability: lens parameters and calibration parameters can change over time due to temperature changes. Real-time calibration? Autocalibration.

Hardware for ADAS. Last challenge: energy efficiency. AlphaGo: 1920 CPUs and 280 GPUs, $3000 electric bill per game. On mobile: drains the battery. On a data center: latency? Increases TCO.