Near-Invariant Blur for Depth and 2D Motion via Time-Varying Light Field Analysis


Near-Invariant Blur for Depth and 2D Motion via Time-Varying Light Field Analysis. Yosuke Bando 1,2, Henry Holtzman 2, Ramesh Raskar 2. 1 Toshiba Corporation, 2 MIT Media Lab

Defocus & Motion Blur PSF

Depth and Motion-Invariant Capture PSF

Deblurring Result

Outline: Motivation, Related Work, Intuitions, Analysis, Results, Conclusions

Joint Defocus & Motion Deblurring. Standard approach: image capture, local blur estimation, non-uniform deblurring. This is extremely difficult: it must estimate depth and motion from a single image and recover lost high-frequency content.

Joint Defocus & Motion Deblurring. Proposed approach: depth and 2D motion-invariant image capture followed by uniform deconvolution, a well-studied problem; no blur estimation is needed.
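Because invariant capture makes the blur kernel the same everywhere in the image, the deconvolution stage reduces to one global filter. A minimal sketch of such uniform deconvolution, assuming an illustrative 3x3 box PSF and a frequency-domain Wiener filter (the actual focus-sweep kernel in this work is different):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=100.0):
    """Deconvolve `blurred` by a spatially uniform `psf` with a Wiener filter."""
    # Center the PSF at the origin for the DFT, then build the transfer function.
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    B = np.fft.fft2(blurred)
    # Wiener filter: conj(H) / (|H|^2 + 1/SNR) regularizes near-zero frequencies.
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * B))

# Illustrative round trip: blur a random image with a 3x3 box PSF, then recover it.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
psf = np.zeros((64, 64))
psf[31:34, 31:34] = 1.0 / 9.0            # centered 3x3 box blur
H = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(H * np.fft.fft2(image)))
restored = wiener_deconvolve(blurred, psf, snr=1e6)
print(np.mean(np.abs(restored - image)))  # small residual
```

With an invariant capture, this one filter applies to the whole image; no per-pixel blur estimation is needed.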

Outline: Motivation, Related Work, Intuitions, Analysis, Results, Conclusions

Depth-Invariant Capture: wavefront coding [Dowski and Cathey 1995], focus sweep [Hausler 1972; Nagahara et al. 2008], diffusion coding [Cossairt et al. 2010], spectral focus sweep [Cossairt and Nayar 2010]. (Figure: depth-invariant image and its deblurred result.)

1D Motion-Invariant Capture [Levin et al. 2008]: accelerating the image sensor makes capture invariant to object speed, but the motion direction must be fixed (horizontal, for example). (Figure: normal camera vs. motion-invariant image and its deblurred result.)

Computational Cameras for Deblurring.
High-frequency preservation (non-invariant): for defocus deblurring, coded aperture [Levin et al. 2007; Veeraraghavan et al. 2007] and the lattice-focal lens [Levin et al. 2009]; for motion deblurring, coded exposure [Raskar et al. 2006], orthogonal parabolic exposures [Cho et al. 2010], and circular sensor motion [Bando et al. 2011].
Invariant capture: for defocus, wavefront coding [Dowski and Cathey 1995], focus sweep [Hausler 1972; Nagahara et al. 2008], diffusion coding [Cossairt et al. 2010], and spectral focus sweep [Cossairt and Nayar 2010]; for motion, motion-invariant photography (for 1D motion) [Levin et al. 2008]. The focus sweep is also nearly 2D motion-invariant.
No prior work addresses joint defocus and motion deblurring, and none offers 2D motion-invariant capture.
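The intuition for why a focus sweep is nearly depth-invariant can be checked numerically. Below is a 1D toy model (my own construction, not this work's derivation): a point with depth offset d sees an instantaneous defocus blur of half-width |r(t) − d|, and sweeping r(t) linearly through the range makes the time-integrated PSF nearly the same for different d:

```python
import numpy as np

def sweep_psf(d, radius=1.0, steps=2000, grid=np.linspace(-2, 2, 401)):
    """Time-integrated 1D PSF of a linear focus sweep for depth offset d."""
    psf = np.zeros_like(grid)
    dx = grid[1] - grid[0]
    for r in np.linspace(-radius, radius, steps):
        w = max(abs(r - d), dx)               # instantaneous blur half-width
        psf += (np.abs(grid) <= w) / (2 * w)  # normalized box (defocus) blur
    return psf / steps

p0 = sweep_psf(0.0)   # point at the middle of the swept depth range
p1 = sweep_psf(0.5)   # point well off the mid-plane
# Relative L1 difference between the two depths' integrated PSFs:
diff = np.sum(np.abs(p0 - p1)) / np.sum(p0)
print(diff)
```

The printed relative difference stays well below 1, whereas two static defocus blurs at these depths would differ almost completely; this is the sense in which the sweep is "near-invariant".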

Outline: Motivation, Related Work, Intuitions, Analysis, Results, Conclusions

Depth-Invariance for a Static Point (diagram: scene point, plane of focus, aperture, sensor, shown over time)

Motion-Invariance for a Moving Point (diagram: scene point, aperture, sensor, and object motion, shown over time)

Follow Shot http://commons.wikimedia.org/wiki/file:bruno_senna_2006_australian_grand_prix-3.jpg

Follow Shots for Various Motions (x-y-t space-time diagrams)

Outline: Motivation, Related Work, Intuitions, Analysis, Results, Conclusions

Analysis. A photo is a projection of a light field [Ng 2005]:
D(x₀) = ∬ k(x₀ − x, u) l(x, u) dx du
where D is the defocus-blurred image, k is the light field kernel, and l is the light field; x = (x, y) indexes the sensor and u = (u, v) the aperture. (Diagram: aperture of size A, sensor, scene point at defocus distance b.)

Analysis. A photo is a projection of a time-varying light field:
D(x₀) = ∭ k(x₀ − x, u, t) l(x, u, t) dx du dt
where D is the defocus/motion-blurred image, k is the time-varying light field kernel, and l is the time-varying light field; x = (x, y) indexes the sensor, u = (u, v) the aperture, and the scene point moves with velocity (m_x, m_y).

Time-Varying Light Field Analysis. For a Lambertian scene at depth s moving with velocity m = (m_x, m_y), the joint defocus and motion blur PSF is
φ_{s,m}(x) = ∬ k(x + su + mt, u, t) du dt
and the magnitude of its 2D Fourier transform gives the modulation transfer function (MTF):
|φ̂_{s,m}(f_x)|² = |k̂(f_x, s f_x, m · f_x)|²
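The MTF relation above is a Fourier-slice property of the projection, and it can be verified numerically. The sketch below is a reduced toy setting of my own (1D space, 1D aperture, discrete time, periodic boundaries, integer s and m so the slice lands on DFT grid points); sign conventions follow the DFT rather than the slide's continuous transform:

```python
import numpy as np

N = 32
rng = np.random.default_rng(1)
k = rng.random((N, N, N))                 # toy time-varying kernel k[x, u, t]
s, m = 2, 3                               # integer depth / velocity parameters

# PSF: phi[x] = sum over (u, t) of k[(x + s*u + m*t) mod N, u, t]
phi = np.zeros(N)
for u in range(N):
    for t in range(N):
        phi += np.roll(k[:, u, t], -(s * u + m * t))

# Fourier-slice check: the DFT of phi equals a line slice through the 3D DFT of k.
K = np.fft.fftn(k)
f = np.arange(N)
slice_vals = K[f, (-s * f) % N, (-m * f) % N]
print(np.max(np.abs(np.fft.fft(phi) - slice_vals)))  # ~0 up to round-off
```

Taking magnitudes of both sides recovers the MTF identity: the blur's frequency response for each (s, m) is read off a slice of k̂, which is what lets one camera design be scored over all depths and velocities at once.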

Analysis Procedure and Findings. For each existing computational camera for deblurring: 1. derive a kernel equation describing the optical system; 2. calculate its Fourier transform to obtain the MTF; 3. compare it with the theoretical upper bounds. The focus sweep attains 58% and 66% of the upper bounds, better than any other existing computational camera for deblurring.

Outline: Motivation, Related Work, Intuitions, Analysis, Results, Conclusions

Prototype Focus Sweep Camera

Prototype Camera & Setup (diagram: reference camera and focus sweep camera viewing the scene through a beam splitter; hot shoe shutter release signal; Arduino + batteries sending SPI commands to move the focus)

Normal Camera Image (annotations: defocused, focused, motion blur)

Focus Sweep Image

Deconvolution Result

Short Exposure Narrow Aperture Image

More Examples (figure labels: motion, focus, N/A; standard camera, focus sweep, deconvolution results)

Limitations. Object depth and speed ranges must be bounded, and the depth and speed ranges cannot be adjusted separately. Object motion must be in-plane and linear, so camera shake cannot be handled. (Figure: standard camera, focus sweep, deconvolved result.)

Rotation & Z Motion (figure labels: focus, motion; standard camera, focus sweep, deconvolution results)

Summary. A simple approach to joint defocus & motion deblurring: no need to estimate scene depth or motion, it also preserves high-frequency image content, it is theoretically near-optimal, and it has a practical implementation (just a firmware update). (Figure: standard camera, focus sweep, deconvolution results.)

Summary. Simple joint defocus & motion deblurring: no depth or motion estimation, preserves high frequencies, theoretically near-optimal, practical implementation. Details (how to control the lens; how to achieve perfect invariance): http://www.media.mit.edu/~bandy/invariant/
Computational Cameras & Displays 2013.
Acknowledgments: Yusuke Iguchi, Noriko Kurachi, Matthew Hirsch, Matthew O'Toole, Douglas Lanman, Cheryl Sham, Sonia Chang, Shih-Yu Sun, Jeffrey W. Kaeli, Bridger Maxwell, Austin S. Lee, Saori Bando.