Time of Flight Capture

Time of Flight Capture
CS635 Spring 2017, Daniel G. Aliaga, Department of Computer Science, Purdue University

Range Acquisition Taxonomy
Range acquisition
- Contact
  - Mechanical (CMM, jointed arm)
  - Inertial (gyroscope, accelerometer)
  - Ultrasonic trackers
  - Magnetic trackers
- Transmissive
  - Industrial CT
  - Ultrasound
  - MRI
- Reflective
  - Non-optical (radar, sonar)
  - Optical

Range Acquisition Taxonomy
Optical methods
- Passive
  - Shape from X: stereo, motion, shading, texture, focus, defocus
- Active
  - Active variants of passive methods: structured light, active depth from defocus, photometric stereo
  - Triangulation (e.g., lasers)
  - Time of flight

Optical Range Scanning Methods
Advantages:
- Non-contact
- Safe
- Usually inexpensive
- Usually fast
Disadvantages:
- Sensitive to transparency
- Confused by specularity and interreflection
- Texture (helps some methods, hurts others)

Stereo
Find a feature in one image, then search along the corresponding epipolar line in the other image for its match.
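Once a correspondence is found, a rectified stereo pair turns disparity into depth via Z = f * B / d. A minimal sketch (the focal length, baseline, and disparity values are illustrative, not from the slides):

```python
def depth_from_disparity(f_pixels, baseline_m, disparity_pixels):
    """Rectified stereo: depth Z = f * B / d, with focal length f in
    pixels, baseline B in meters, and disparity d in pixels."""
    if disparity_pixels <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_pixels * baseline_m / disparity_pixels

# Example: f = 700 px, B = 0.1 m, d = 14 px -> Z = 5 m
print(depth_from_disparity(700.0, 0.1, 14.0))
```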

Stereo
Advantages:
- Passive
- Cheap hardware (two cameras)
- Easy to accommodate motion
- Intuitive analogue to human vision
Disadvantages:
- Only acquires good data at features
- Sparse, relatively noisy data (correspondence is hard)
- Bad around silhouettes
- Confused by non-diffuse surfaces
Variant: multibaseline stereo to reduce ambiguity

Shape from Motion
Limiting case of multibaseline stereo: track features through a video sequence. For n frames and f features there are 2nf knowns (one 2D image position per feature per frame) but 6n + 3f unknowns (a 6-DOF camera pose per frame plus a 3D point per feature).
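A tiny counting helper makes the slide's arithmetic concrete (the function name is illustrative):

```python
def sfm_counts(n_frames, f_features):
    """Shape-from-motion bookkeeping: 2*n*f knowns (2D feature
    positions) vs. 6*n + 3*f unknowns (camera poses + 3D points)."""
    knowns = 2 * n_frames * f_features
    unknowns = 6 * n_frames + 3 * f_features
    return knowns, unknowns

# Example: 3 frames, 10 features -> 60 knowns vs. 48 unknowns,
# so the reconstruction is (generically) over-constrained.
print(sfm_counts(3, 10))
```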

Shape from Motion
Advantages:
- Feature tracking between nearby frames is easier than correspondence across far-apart views
- Mathematically more stable (large baseline)
Disadvantages:
- Does not accommodate object motion
- Still problems in areas of low texture, in non-diffuse regions, and around silhouettes

Shape from Shading
Given: an image of a surface with known, constant reflectance under a known point light. Estimate normals, then integrate to find the surface. Problem: ambiguity.

Shape from Shading
Advantages:
- Single image
- No correspondences
- Analogue in human vision
Disadvantages:
- Mathematically unstable
- Can't have texture
- Not really practical (but see photometric stereo)

Shape from Texture
Mathematically similar to shape from shading, but uses the stretch and shrink of a (regular) texture.

Shape from Texture
Analogue to human vision; same disadvantages as shape from shading.

Shape from Focus and Defocus
Shape from focus: at which focus setting is a given image region sharpest?
Shape from defocus: how out-of-focus is each image region?
Passive versions are rarely used, but active depth from defocus can be made practical.
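As a rough sketch of the shape-from-focus question above: score each focus setting with a local sharpness measure and keep the argmax. Variance of the Laplacian is one common focus measure (an assumed choice here, since the slide names none):

```python
import numpy as np

def laplacian_variance(region):
    """Focus measure: variance of a 4-neighbor Laplacian over a region."""
    r = region.astype(float)
    lap = (-4.0 * r[1:-1, 1:-1] + r[:-2, 1:-1] + r[2:, 1:-1]
           + r[1:-1, :-2] + r[1:-1, 2:])
    return lap.var()

def sharpest_setting(focus_stack):
    """focus_stack: one 2D array per focus setting, all covering the
    same image region. Returns the index of the sharpest setting."""
    return int(np.argmax([laplacian_variance(img) for img in focus_stack]))
```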

Active Optical Methods
Advantages:
- Usually can get dense data
- Usually much more robust and accurate than passive techniques
Disadvantages:
- Introduces light into the scene (distracting, etc.)
- Not motivated by human vision

Active Variants of Passive Techniques
- Regular stereo with projected texture (= structured light): provides features for correspondence
- Active depth from defocus: a known projected pattern helps to estimate defocus
- Photometric stereo: shape from shading with multiple lights (sketched below)
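A minimal photometric-stereo sketch under the classic assumptions (Lambertian surface, three known distant lights): per pixel, intensity I_k = albedo * (l_k . n), so stacking the light directions into a 3x3 matrix L and solving L g = I gives g = albedo * n. The light setup and values below are illustrative:

```python
import numpy as np

def normal_from_three_lights(L, I):
    """L: 3x3 matrix of unit light directions (one per row).
    I: observed intensities at one pixel under each light."""
    g = np.linalg.solve(L, I)   # g = albedo * n
    albedo = np.linalg.norm(g)
    return g / albedo, albedo

L = np.array([[0.0,    0.0,    1.0],
              [0.7071, 0.0,    0.7071],
              [0.0,    0.7071, 0.7071]])
I = np.array([1.0, 0.7071, 0.7071])    # consistent with n = (0, 0, 1)
print(normal_from_three_lights(L, I))  # ~(array([0., 0., 1.]), 1.0)
```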

Time of Flight
A time-of-flight (ToF) camera works by illuminating the scene with a modulated light source and observing the reflected light. The phase shift between the illumination and the reflection is measured and translated to distance.
Not new:
- A. Gruss et al., "Integrated Sensor and Range-Finding Analog Signal Processor," IEEE Journal of Solid-State Circuits, 1991
- R. Miyagawa and T. Kanade, "CCD-Based Range-Finding Sensor," IEEE Transactions on Electron Devices, 1997
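Translating phase to distance uses the standard continuous-wave relation d = c * dphi / (4 * pi * f_mod), with an unambiguous range of c / (2 * f_mod). A small sketch (the 30 MHz modulation frequency is chosen only for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(dphi_rad, f_mod_hz):
    """Standard CW ToF relation: d = c * dphi / (4 * pi * f_mod)."""
    return C * dphi_rad / (4.0 * math.pi * f_mod_hz)

# 30 MHz modulation: unambiguous range c / (2 * f_mod) ~ 5 m,
# and a phase shift of pi/2 maps to ~1.25 m.
print(C / (2.0 * 30e6))                      # ~4.997
print(phase_to_distance(math.pi / 2, 30e6))  # ~1.249
```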

Time of Flight
Being rediscovered and enabled by advances in hardware (since ~2000): e.g., Swiss Ranger, ZCam, Canesta, Kinect (= ZCam + Canesta + MSFT $$$).
- Kadambi et al., "Coded Time of Flight Cameras: Sparse Deconvolution to Resolve Multipath Interference," ACM SIGGRAPH Asia, 2013
Often uses ~850 nm (near-infrared) light, so it is not visible to humans.

Pulsed Time of Flight

Pulsed Time of Flight
Advantages:
- Large working volume (up to ~100 m)
Disadvantages:
- Not-so-great accuracy (at best ~5 mm)
- Requires getting timing to ~30 picoseconds
- Does not scale with working volume
Often used for scanning buildings, rooms, archeological sites, etc.

Pulsed Time of Flight
- Send square waves: easier to produce with digital circuits
- Start a counter to measure the time delay
- Achieving 1 mm accuracy requires timing resolution of about 6.6 picoseconds
- More achievable, but still hard, is 5 mm accuracy, which needs about 30 picoseconds (checked below)
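The picosecond figures follow from the round trip: a depth difference dd changes the echo's arrival time by dt = 2 * dd / c. A quick check of the slide's numbers:

```python
C = 299_792_458.0  # speed of light, m/s

def timing_resolution_s(depth_accuracy_m):
    """Timing resolution needed to resolve a given depth difference:
    the light travels out and back, so dt = 2 * dd / c."""
    return 2.0 * depth_accuracy_m / C

print(timing_resolution_s(0.001) * 1e12)  # 1 mm -> ~6.7 ps
print(timing_resolution_s(0.005) * 1e12)  # 5 mm -> ~33 ps
```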

Pulsed Time of Flight
How can photo-detectors measure the time it takes the reflection to get back? And how is that time converted to distance?

Pulsed Time of Flight
- C1: shutter window coincident with the light pulse
- C2: shutter window corresponding to !C1 (the complement of C1)
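With two complementary shutter windows, the returned pulse's charge splits between C1 and C2 in proportion to its delay. For pulse width t_pulse and accumulated charges Q1 and Q2, the standard relation for this scheme (not spelled out on the slide) is d = 0.5 * c * t_pulse * Q2 / (Q1 + Q2). A minimal sketch with illustrative values:

```python
C = 299_792_458.0  # speed of light, m/s

def pulsed_tof_distance(q1, q2, t_pulse_s):
    """Two-window pulsed ToF: the fraction of returned-light charge
    landing in the complementary window C2 encodes the delay."""
    return 0.5 * C * t_pulse_s * q2 / (q1 + q2)

# Example: 50 ns pulse with a quarter of the charge in C2 -> ~1.87 m
print(pulsed_tof_distance(q1=0.75, q2=0.25, t_pulse_s=50e-9))
```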

Continuous Wave ToF
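Continuous-wave ToF recovers the phase shift by correlating the received signal against the emitted modulation at several phase offsets. A minimal sketch of the standard four-sample ("four-bucket") demodulation; the measurement names Q1..Q4, taken at offsets of 0, 90, 180, and 270 degrees, are the conventional ones, not taken from these slides:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def cw_tof_depth(q1, q2, q3, q4, f_mod_hz):
    """Standard four-bucket CW ToF demodulation:
    phase      dphi = atan2(Q3 - Q4, Q1 - Q2)
    amplitude  A    = 0.5 * sqrt((Q3 - Q4)**2 + (Q1 - Q2)**2)
    distance   d    = c * dphi / (4 * pi * f_mod)
    """
    dphi = math.atan2(q3 - q4, q1 - q2) % (2.0 * math.pi)  # wrap to [0, 2*pi)
    amplitude = 0.5 * math.hypot(q3 - q4, q1 - q2)
    distance = C * dphi / (4.0 * math.pi * f_mod_hz)
    return distance, amplitude

# Example: a quarter-cycle phase shift at 30 MHz -> ~1.25 m
print(cw_tof_depth(q1=1.0, q2=1.0, q3=1.5, q4=0.5, f_mod_hz=30e6))
```

The modulo-2*pi wrap is what produces the unambiguous-range limit c / (2 * f_mod) noted earlier; in practice it is commonly extended by combining measurements at multiple modulation frequencies.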