A NEW NEUROMORPHIC STRATEGY FOR THE FUTURE OF VISION FOR MACHINES
June 2017
Xavier Lagorce, Head of Computer Vision & Systems


Imagine meeting the promise of:
- Restoring sight to the blind
- Accident-free autonomous vehicles
- High-speed collision avoidance
- Harmonious human/robot collaboration
- Surveillance without power drain
This is reality for Chronocam.

A paradigm shift is coming to computer vision

TECHNOLOGY / MARKET: a 4th disruption in image sensors.
Market eras: film photography, digital photography, mobile photography, and now sensing (slide timeline: 1865, 1945, 2005, 2015, 2020).
Sensor technologies: tubes, CCD, frame-based CMOS, and now event-based CMOS.

From imaging «frames»: adapted for static images, with an impossible power-vs-frame-rate trade-off:
- Data redundancy
- Information loss
- Light-dependent

...to sensing «events»: by capturing only the changes in a scene, event-based computer vision is optimized for dynamic applications:
- Redundancy-free: 1000x less data
- Ultra-high speed: microsecond precision
- Wide dynamic range: 140+ dB
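The change-detection principle can be sketched in a few lines of Python: a toy simulator that emits an event whenever a pixel's log-intensity has moved by more than a contrast threshold since the last event at that pixel. The function name, the event tuple layout and the 0.2 threshold are illustrative assumptions, not Chronocam's implementation.

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Emit (t, x, y, polarity) events wherever a pixel's log-intensity
    changes by more than `threshold` relative to its last-event level.
    `frames`: sequence of 2D float arrays (linear intensity);
    `timestamps`: matching capture times in microseconds."""
    events = []
    ref = np.log(frames[0] + 1e-6)              # per-pixel reference level
    for frame, t in zip(frames[1:], timestamps[1:]):
        logf = np.log(frame + 1e-6)
        diff = logf - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else 0
            events.append((t, x, y, polarity))
            ref[y, x] = logf[y, x]              # reset reference where fired
    return events
```

A static scene produces no events at all; only the pixels that actually change contribute data, which is where the "redundancy-free" claim comes from.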

Computer vision market: ~$50B by 2022.
Total units (vision sensors sold in 2020):
- Mobile: 6B
- Automotive: 300M
- Consumer: 200M
- Industrial automation: 150M
- Wearable: 150M
- Surveillance: 125M
- Robotics: 20M
- Medical devices: 6M
Projected revenue growth ($B, 2017-2022) across automotive, sports & entertainment, consumer, robotics & machine vision, medical, security & surveillance, retail and agriculture.
Sources: Tractica, Yole, M&M, internal analyses.

A complete event-based computer vision solution:
- AUTOMOTIVE: $8B in 2020, 5Y CAGR 44.8%. Examples: collision warning, lane departure warning, sign recognition, driver assistance & monitoring
- INDUSTRIAL: $6B in 2020, 5Y CAGR 31.1%. Examples: inspection, autonomous guided vehicles, collaborative robots, pick & place
- SMART IOT: $10B in 2020, 5Y CAGR 33.2%. Examples: smart city monitoring, smart home wake-up, smart workplace security & surveillance
- PROSUMER: $6B in 2020, 5Y CAGR 22.5%. Examples: AR/VR/MR, wearables, health monitoring
Sources: Tractica, Yole, M&M, internal analyses.

Event-based computer vision

Computer vision inspired by biology: more efficient visual information acquisition.
- Biological vision does not use images to see
- Machine vision needs vision, not images
- Event-based vision uses pixels to capture relevant information: only the changes in a scene

Pixel-controlled sensor: adapted for dynamic scenes.
- Each pixel individually controls its own sampling rate: active when its signal changes, inactive when nothing changes
- A logarithmic pixel detects illuminance changes and emits change events
- Grey levels are emitted as greyscale events (TCDS pairs): the integration time t_int between the two events encodes the grey level (grey level ~ 1/t_int)
- Events are read out asynchronously through X and Y arbiters via a request/acknowledge handshake, producing a pixel address (Addr X, Addr Y) and a timestamp

What this means:
- Auto-sampling of pixels
- Pixel-individual optimization of sampling
- Zero-redundancy sampling
- Time-domain encoding of exposure

Results:
- High-speed response (sub-millisecond)
- Low data rate (10-1000x less data)
- Wide dynamic range (120-140 dB)
- Low-power operation (<10 mW, QVGA)

Benefits: real-time vision processing (tracking, motion flow, 3D reconstruction) with millisecond-to-microsecond update rates.
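The time-domain encoding of exposure (grey level ~ 1/t_int) can be made concrete with a small helper that turns an ATIS-style pair of exposure-measurement event timestamps back into a grey value. The function name and the scale constant k are illustrative assumptions for display purposes, not sensor constants.

```python
def grey_from_exposure(t_first_us, t_second_us, k=1000.0):
    """Recover a grey level from a pair of exposure-measurement events.

    After a change event, the pixel integrates light and fires two events
    as two thresholds are crossed; the brighter the pixel, the shorter the
    interval t_int between them, so grey ~ k / t_int.
    """
    t_int = t_second_us - t_first_us
    if t_int <= 0:
        raise ValueError("second threshold event must come after the first")
    return k / t_int

# A bright pixel crosses both thresholds quickly, a dark one slowly:
bright = grey_from_exposure(0, 10)     # t_int = 10 us   -> grey 100.0
dark = grey_from_exposure(0, 1000)     # t_int = 1000 us -> grey 1.0
```

Because exposure is encoded in time rather than in a fixed-length integration window, each pixel effectively chooses its own exposure, which is what gives the sensor its wide dynamic range.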

Event imaging: frames are absent from the acquisition process.
- Standard camera: fixed sampling rate
- Event camera: microsecond event sampling

Event imaging impact: low bandwidth.
- Standard camera: constant high bandwidth; needs encode/decode to stream
- Event camera: scene-optimized bandwidth; the stream can be processed directly
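A rough back-of-the-envelope comparison makes the bandwidth claim tangible. The figures below are assumptions (10-bit pixels, 40 bits per packed event, 100k events/s); real event rates are entirely scene-dependent.

```python
def frame_camera_rate_mbps(width, height, fps, bits_per_pixel=10):
    """Raw data rate of a conventional frame camera, in Mbit/s."""
    return width * height * fps * bits_per_pixel / 1e6

def event_camera_rate_mbps(events_per_second, bits_per_event=40):
    """Raw data rate of an event stream, in Mbit/s; each event carries
    an address, a polarity and a timestamp (40 bits is an assumed packing)."""
    return events_per_second * bits_per_event / 1e6

# A VGA frame camera at 30 fps vs an event stream of 100k events/s:
frame = frame_camera_rate_mbps(640, 480, 30)   # 92.16 Mbit/s
event = event_camera_rate_mbps(100_000)        # 4.0 Mbit/s
```

Even with these conservative numbers the event stream is over 20x smaller; on mostly static scenes the gap widens toward the 1000x figure quoted on the slide.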

Event imaging impact: ultra-high speed.
- Standard camera: latency of ~1 ms at best, set by the sensor's acquisition speed, i.e. by the slowest pixel
- Event camera: asynchronous pixels respond independently

Pixel exposure impact: high dynamic range.
- Standard camera (uniform exposure): an exposure time set too low loses dark regions (~70 dB dynamic range); set too high, it saturates bright regions
- Event camera (per-pixel self-adjusted exposure): dark pixels take longer exposures, bright pixels shorter ones (~120 dB dynamic range)

Pixel exposure impact: no motion blur.
- Standard camera: a uniform exposure set for the whole scene, long compared to the motion speed, produces blur
- Event camera: asynchronous per-pixel exposure

Event-based technology enables embedded AI in objects, devices & machines.

Vision trends creating 360° awareness. Automotive camera market (slide timeline 2017-2029; autonomy milestones: Feet Off, Hands Off, Eyes Off):
- $3.7B revenue, ASP $25, 140M units, 2-8 cameras per vehicle
- $7.1B revenue, ASP $22, 320M units, 3-10 cameras per vehicle
- $22.2B revenue, ASP <$20, 1.1B units, 6-14 cameras per vehicle
Camera positions: surrounding, forward, blind spots, forward (stereo), in-car, backup.

Rethinking ADAS/AD: event-based technology as a key enabler of low-latency detection and fast classification of obstacles.

Respond to any imminent danger in real time, with efficient data acquisition tailored for machines:
- Runtime inference
- Ultra-high speed
- Smaller training datasets
- Relevant regions of interest
- Temporal & spatial precision
- No sensor signal processing
- Native optical flow
- Low-bandwidth transmission

Provide an advanced edge solution, e.g. pedestrian detection latency:
- CCAM (event-based): <10 ms
- Radar: ~10 ms
- Lidar: ~30 ms
- Frame-based camera (central unit): 100+ ms
Transmission: Ethernet/optical.
- High temporal precision: <10 ms latency for detection; first-level classification; robustness to occlusion
- Low data rate: generates change events; edge classification; smart compression
- Reduced computation: edge processing; relevant regions of interest; no ISP (image signal processing) required
- High dynamic range: 140+ dB; self-adjusted exposure

Introducing unprecedented safety capabilities: a comparison of radar, lidar, ultrasound, frame cameras and event cameras on light independence, computational cost, detection speed, optical information and passive operation.

Standard Camera (HD 30fps) CCAM (VGA)

Rethinking in-car monitoring: fast, accurate image sensing at low power.

3D Tracking

Enhanced driver monitoring, e.g. eye tracking & gesture control:
- Latency: CCAM (event-based) <1 ms vs frame-based ~4-8 ms
- Temporal accuracy: CCAM >1 kHz vs frame-based <250 Hz
- Key properties: high temporal precision, low data rate, low power, high dynamic range (140+ dB, self-adjusted exposure)
- <10 ms latency for detection; no motion blur; faster data fetching; graphic-rendering efficiency; adjusted image output
- Edge processing (removes clutter that does not impact downstream processing); relevant regions of interest

A solid team to lead the shift in technology

Leading the event-based computer vision (r)evolution.
Timeline (2010-2018): first ATIS event-based sensor; Chronocam founded (2014); closed 1M fundraising; core technology; named a top computer vision innovator; first product; closed 15M fundraising; top 100 AI startups; named a "Cool Vendor" in AI.
- 20+ patents in HW & SW
- Collaborations: Horizon 2020, ANR, RAPID, DARPA; CNRS, UPMC, CSIC, U. of Illinois, U. of Madrid
- Partnership: Renault-Nissan (ADAS systems)

Founders & senior management:
- Luca Verre, co-founder & CEO
- Bernard Gilly (PhD), co-founder & chairman
- Christoph Posch (PhD), co-founder & CTO
- Ryad Benosman (PhD), co-founder & advisor
- Atul Sinha, board of directors & advisor
- Jean-Luc Jaffard, VP Sensors Engineering & Operations
- Geoff Burns (PhD), VP Products & Vision Systems
- Stephane Laveau (PhD), VP Computer Vision & Software

40 employees in Paris; 14 nationalities; average age 36; 16 PhDs; 37 engineers; 38 in R&D, 4 in G&A.

THANK YOU! www.chronocam.com