Modeling and Synthesis of Aperture Effects in Cameras


Modeling and Synthesis of Aperture Effects in Cameras
Douglas Lanman, Ramesh Raskar, and Gabriel Taubin
Computational Aesthetics 2008, 20 June 2008

Outline
- Introduction and Related Work
- Modeling Vignetting
- Synthesizing Vignetting
- Experimental Results
- Conclusions and Future Work

Motivation: Aperture Effects in Cameras
While small apertures (approaching pinholes) capture all-in-focus images, they are often impractical because of the long exposures required under limited lighting or in the presence of motion. For larger apertures, depth-of-field effects are observed, including: (1) spatially-varying blur that depends on scene depth, and (2) vignetting. We present simple approaches to model and control these effects. (Figure: the same scene photographed at f/32 and f/5.6.)

Related Work
Radiometric camera calibration: sources of non-idealities [Litvinov '05]; flat-field vignetting calibration [Yu '04]; single-image calibration [Zheng '06]; image mosaics [D'Angelo '07].
Coded-aperture imaging: useful for deblurring [Levin '07]; can be applied to capture light field photographs [Veeraraghavan '07].
Variable-aperture photography: aperture bracketing [Hasinoff '06]; confocal stereo [Hasinoff '07].


Sources of Vignetting
Mechanical vignetting: due to physical obstructions (lens hoods, filters, etc.); can completely block light from reaching certain image regions.
Optical vignetting: occurs in multi-element optical designs, due to the decrease in clear area (exit pupil) off-axis; reduced by using small apertures.
Natural and pixel vignetting: combines physical effects (the inverse-square fall-off of light, Lambert's law, and foreshortening) with occlusion due to pixel depth.
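
The three physical factors listed under natural vignetting multiply into the classic cos^4 fall-off law for an off-axis angle theta: one cosine from Lambert's law, one from foreshortening of the pupil, and two from the inverse-square increase in distance to the image point. A minimal sketch (the cos^4 law is standard optics, not a formula stated on this slide):

```python
import math

def natural_falloff(theta_deg):
    """Relative illuminance at off-axis angle theta (cos^4 law)."""
    return math.cos(math.radians(theta_deg)) ** 4

for theta in (0, 15, 30):
    print(f"{theta:2d} deg: relative illuminance {natural_falloff(theta):.3f}")
```

At 30 degrees off-axis the corners already receive only about 56% of the on-axis illuminance, which is why this component alone produces visible darkening in wide-angle images.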

Geometric Model: Spatially-Varying PSF
For a thin lens with focal length f, the in-focus object plane distance D and image plane distance D' satisfy the thin lens law:

    1/D + 1/D' = 1/f.

The image of an out-of-focus point at distance S will be a blurred region of width c, where

    c = (f^2 / N) |S - D| / (S (D - f)),

with N the f-number. This model predicts that the PSF will scale according to both the object distance S and the f-number N, requiring a calibration procedure that samples both parameters. However, as noted by Hasinoff and Kutulakos, the effective blur diameter c~ obeys the linear relation

    c~ ≈ (f / N) |S - D| / S,

i.e., it is proportional to the aperture diameter A = f/N. Under this approximation, the spatially-varying PSF can be estimated from a single image, as a scaled copy of the aperture function.
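
The two blur measures above can be checked numerically. A minimal sketch using the slide's symbols (f = focal length, N = f-number, D = in-focus distance, S = actual object distance); the example values are illustrative, not from the talk:

```python
def blur_diameter(f, N, D, S):
    """Blur-circle diameter c on the sensor for a point at distance S."""
    return (f ** 2 / N) * abs(S - D) / (S * (D - f))

def effective_blur_diameter(f, N, D, S):
    """Hasinoff-Kutulakos effective blur: linear in aperture A = f/N."""
    A = f / N
    return A * abs(S - D) / S

f, D = 0.100, 2.0            # 100 mm lens focused at 2 m (illustrative)
for N in (5.6, 32.0):        # wide vs. narrow aperture, as in the motivation slide
    c = blur_diameter(f, N, D, S=4.0)
    print(f"f/{N}: blur diameter = {1e3 * c:.3f} mm")
```

Note that both expressions scale exactly as 1/N, so halving the aperture diameter halves the blur; this linearity is what lets a single calibration image predict the PSF across f-numbers.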

Photometric Model: Radial Intensity Fall-off
As we have shown, the various sources of vignetting produce a radial reduction in brightness that increases toward the image periphery. This fall-off can be modeled as a low-order polynomial surface. For a known zoom, focal length, and aperture, the traditional solution is to divide the image intensity by a flat-field calibration image taken with a uniform white area light source (e.g., a light box). Unfortunately, this approach cannot simultaneously estimate the PSF. (Figure: original image and vignetting-corrected result.)
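
The low-order polynomial surface mentioned above can be fit and divided out in a few lines. A sketch under the assumption of an even-powered radial polynomial (1 + a2 r^2 + a4 r^4 + ...), which is a common choice; the slide does not specify the exact parameterization:

```python
import numpy as np

def fit_radial_falloff(flat, degree=3):
    """Least-squares fit of an even-powered radial polynomial to a flat field."""
    h, w = flat.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = (xx - w / 2) ** 2 + (yy - h / 2) ** 2
    r2 = r2 / r2.max()                       # normalize radius^2 to [0, 1]
    # Design matrix: columns 1, r^2, r^4, r^6, ...
    A = np.stack([r2.ravel() ** k for k in range(degree + 1)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, flat.ravel(), rcond=None)
    return coeffs, r2

def correct_vignetting(image, coeffs, r2):
    """Divide out the fitted fall-off surface."""
    falloff = sum(c * r2 ** k for k, c in enumerate(coeffs))
    return image / falloff

# Synthetic demo: simulate a vignetted flat field, then undo it.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
r2 = (xx - w / 2) ** 2 + (yy - h / 2) ** 2
r2 = r2 / r2.max()
flat = 1.0 - 0.4 * r2 + 0.1 * r2 ** 2        # simulated radial fall-off
coeffs, r2n = fit_radial_falloff(flat)
corrected = correct_vignetting(flat, coeffs, r2n)
print("max deviation after correction:", float(np.abs(corrected - 1).max()))
```

Because the synthetic fall-off lies inside the model family, the correction here is essentially exact; with real flat-field images the residual reflects how well a low-order surface captures the true fall-off.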

Experimental Vignetting Calibration: Calibration using Point Sources
To obtain simultaneous estimates of the spatially-varying PSF and the intensity fall-off, we use an array of point sources for calibration. We observe that the image of a point light directly gives the local PSF (i.e., the impulse response). Sparse PSFs are then upsampled to dense (i.e., per-pixel) PSFs using triangle-based interpolation: a Delaunay triangulation of the sparse blur positions, with barycentric coordinates weighting the surrounding blur kernels. (Figures: calibration pattern; star-shaped aperture image; open-aperture image.)
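
The dense-PSF step can be sketched as follows. This is an assumed implementation of the triangle-based interpolation described above, not the authors' code; the positions and kernels are synthetic stand-ins for measured point-light data:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
positions = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
kernels = rng.random((4, 7, 7))            # one measured 7x7 PSF per point light
tri = Delaunay(positions)

def interpolate_psf(query):
    """Barycentric blend of the three kernels surrounding `query`."""
    simplex = tri.find_simplex(query)
    verts = tri.simplices[simplex]
    T = tri.transform[simplex]              # affine map to barycentric coordinates
    b = T[:2].dot(np.asarray(query) - T[2])
    weights = np.append(b, 1.0 - b.sum())   # weights sum to one
    return np.tensordot(weights, kernels[verts], axes=1)

psf = interpolate_psf([0.25, 0.25])
print(psf.shape)                            # interpolated 7x7 kernel
```

Since the barycentric weights are non-negative and sum to one inside each triangle, every interpolated kernel is a convex combination of measured kernels, which keeps the dense PSF field physically plausible.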

Experimental Vignetting Calibration
Note: gray kernels are images of point lights; red kernels are linearly interpolated.


Vignetting Synthesis: The Bokeh Brush
Goal: we desire the ability to control the spatially-varying PSF after capture. Since the spatially-varying PSF is related to the shape of out-of-focus points, this goal is equivalent to controlling bokeh; we will develop a "Bokeh Brush".
Observation: the PSF and the aperture are closely connected. (Figures: examples of bokeh for a calibration object with an open circular aperture and with a star-shaped aperture.)

Vignetting Synthesis: Superposition Principle
For unit magnification, the recorded image irradiance I_i(x,y) at a pixel (x,y) is

    I_i(x,y) = ∫∫_Ω B(s,t;x,y) I_o(s,t) ds dt,

where Ω is the domain of the image, I_o(s,t) is the object-plane irradiance, and B(s,t;x,y) is the spatially-varying PSF. We can express the PSF using N basis functions {B_i(s,t;x,y)}, such that

    B(s,t;x,y) = Σ_{i=1}^{N} w_i B_i(s,t;x,y).

Since the PSF and aperture are directly related, we collect a sequence of basis images using the apertures {A_i(s,t;x,y)}. The basis images can be linearly combined to synthesize the image for any aperture, so long as the basis is a good approximation.
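
The superposition principle above can be demonstrated numerically. A minimal sketch with a toy forward model (translated-pinhole apertures, a synthetic scene; all names here are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((32, 32))                # stand-in object-plane irradiance

def capture(aperture_weights):
    """Toy forward model: each 'pinhole' contributes a shifted scene copy."""
    shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
    out = np.zeros_like(scene)
    for w, (dy, dx) in zip(aperture_weights, shifts):
        out += w * np.roll(scene, (dy, dx), axis=(0, 1))
    return out

# One basis image per pinhole aperture:
basis_images = [capture(np.eye(4)[i]) for i in range(4)]

# Post-capture synthesis of an arbitrary aperture = weighted pinholes:
target_weights = np.array([0.1, 0.4, 0.3, 0.2])
synthesized = sum(w * b for w, b in zip(target_weights, basis_images))
direct = capture(target_weights)
print("max error:", float(np.abs(synthesized - direct).max()))
```

Because image formation is linear in the aperture, the synthesized image matches a direct capture with the target aperture exactly; with real photographs, sensor noise and basis approximation error set the quality limit.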

Aperture Superposition: Lens Modifications
Canon EOS Digital Rebel XT with a Canon EF 100mm 1:2.8 Macro lens, modified to allow manual insertion of aperture patterns directly into the plane of the iris (i.e., by removing the original lens diaphragm).

Aperture Superposition: Laboratory Results
Note: synthesized seven-segment images using aperture superposition.

Bokeh Synthesis using PCA
Applying Principal Component Analysis: although we have shown that images captured with different apertures can be linearly combined, we still require an efficient basis. One solution is to use a set of translated pinholes (equivalent to recording the incident light field), but specialized bases can achieve greater compression ratios. Here we develop a positive-valued PCA basis. (Figures: training apertures; basis apertures from PCA; reconstruction results.)

Bokeh Synthesis using NMF
Applying Non-negative Matrix Factorization: rather than normalizing a PCA basis, we can find non-negative apertures directly using NMF. NMF, developed by Lee and Seung [1999], iteratively solves for a positive basis. NMF eliminates the open and bias apertures used by PCA, reducing the total number of apertures. Unfortunately, the NMF basis is not unique. (Figures: training apertures; basis apertures from NMF; reconstruction results.)
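
Lee and Seung's multiplicative updates, which the slide cites, can be sketched in a few lines: factor the non-negative matrix of training apertures V into a non-negative basis W and non-negative weights H. The training data here is random, for illustration only:

```python
import numpy as np

def nmf(V, k, iters=200, eps=1e-9):
    """Lee-Seung multiplicative updates minimizing ||V - W H||_F."""
    rng = np.random.default_rng(0)
    n, m = V.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update weights
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H

V = np.random.default_rng(2).random((20, 64))   # 20 flattened training apertures
W, H = nmf(V, k=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative NMF error with 5 factors: {err:.3f}")
```

Since the updates only multiply by non-negative ratios, W and H stay non-negative throughout, so each column of H is directly usable as a mask; the non-uniqueness noted on the slide shows up here as sensitivity to the random initialization.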


Spatially-Varying Deblurring
Example of simulated defocus blur (i.e., made invariant to scene depth), deblurred with the estimate of the spatially-varying PSF from the proposed method; compare with deblurring using the mean PSF. (Figures: original image; uniformly-defocused image; deblurred with mean PSF; deblurred with spatially-varying PSF.)
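
The slide does not name a deconvolution algorithm; a standard choice for deblurring with a known PSF is Wiener deconvolution in the Fourier domain, sketched below on synthetic data. A spatially-varying PSF would typically be handled by deconvolving overlapping tiles, each with its locally interpolated kernel:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=100.0):
    """Frequency-domain Wiener filter for a known, centered PSF."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))

# Demo: blur a synthetic image with a centered 5x5 box PSF, then invert.
rng = np.random.default_rng(3)
image = rng.random((64, 64))
psf = np.zeros((64, 64))
psf[32 - 2:32 + 3, 32 - 2:32 + 3] = 1.0 / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) *
                               np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deconvolve(blurred, psf, snr=1e6)
print("restoration error:", float(np.abs(restored - image).max()))
```

The snr term regularizes frequencies where the PSF's transfer function is small; with noisy captures a much lower snr value trades residual blur against noise amplification.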

Vignetting Synthesis: Simulated Results
Example of Bokeh Brush post-capture stylization, where the aperture function has been re-synthesized to represent each letter of the capitalized Arial font using a PCA-derived basis set. (Figures: simulated PCA basis images; synthesized apertures.)

Vignetting Synthesis: Simulated Results
Example of Bokeh Brush post-capture stylization, where the aperture function has been re-synthesized in a spatially-varying manner to read "BOKEH" along the left wall, using a PCA basis set. (Figures: original HDR image; first PCA basis aperture; bokeh stylization.)


Conclusions and Future Work
Contributions:
- Applied the fact that the out-of-focus image of a point light directly gives the point spread function, leading to a practical, low-cost method to simultaneously estimate vignetting and the spatially-varying point spread function.
- Introduced the Bokeh Brush: a novel, post-capture method for full-resolution control of the shape of out-of-focus points, achieved using a small set of images with varying basis aperture shapes.
Limitations of the calibration procedure:
- Point sources require long exposures and provide only sparse PSFs.
- The point light source array is assumed to be uniform, but LCD brightness can vary.
Limitations of the Bokeh Brush:
- Can only reconstruct apertures that are well approximated by the chosen basis.
- Achieves only modest compression ratios for the included examples.
Future work:
- Apply the Bokeh Brush directly to light field photographs (i.e., a pinhole basis set).