Improving Film-Like Photography, aka Epsilon Photography


Improving Film-Like Photography, aka Epsilon Photography. Ankit Mohan. Courtesy of Ankit Mohan. Used with permission.

Film-Like Optics: Imaging Intuition. Angle (θ,ϕ), Ray, Center of Projection, Position (x,y). Well-lit 3D scene; 2D sensor (pixel grid or film). Pinhole model: rays copy the scene onto film.

Film-Like Optics: Imaging Intuition. Angle (θ,ϕ), Scene, Ray, Lens, Center of Projection, Sensor, Position (x,y). Pinhole model: rays copy the scene onto film.

Not One Ray, but a Bundle of Rays Angle(θ,ϕ) Scene Ray Lens Center of Projection Sensor Position (x,y)

Not One Ray, but a Bundle of Rays. Scene, Lens, Sensor, Aperture. (But the ray model isn't perfect: it ignores diffraction.) Lens, aperture, and diffraction set the point-spread function (PSF). (How? See: Goodman, J. W., An Introduction to Fourier Optics, 1968.)

Review: Lens Measurements. Scene, Lens, Sensor, S1, S2. How do we compute S1 and S2 for a lens? What is the ray-bending strength of a lens?

Review: Focal Length f. Lens focal length f: where parallel rays converge.

Review: Focal Length f. Lens focal length f: where parallel rays converge. Smaller focal length: more ray-bending ability.

Review: Focal Length f. Lens focal length f: where parallel rays converge. Greater focal length: less ray-bending ability. For flat glass, or for air: f = ∞.

Review: Thin Lens Law. Scene, Lens, Sensor, f, S1, S2. Thin Lens Law: in focus when 1/S1 + 1/S2 = 1/f. Note that S1 ≥ f and S2 ≥ f. Try it live! Physlets: http://webphysics.davidson.edu/applets/optics/intro.html
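The thin lens law turns into a two-line calculation. A minimal sketch (function name and millimeter units are my own choice), solving 1/S1 + 1/S2 = 1/f for the image distance S2:

```python
def thin_lens_image_distance(f, s1):
    """Given focal length f and object distance s1 (same units, e.g. mm),
    solve the thin lens law 1/s1 + 1/s2 = 1/f for the image distance s2."""
    if s1 <= f:
        raise ValueError("object at or inside the focal length: no real image")
    return 1.0 / (1.0 / f - 1.0 / s1)

# A 50 mm lens focused on a subject 1 m away needs the sensor ~52.6 mm behind the lens:
s2 = thin_lens_image_distance(50.0, 1000.0)
```

Note that the result is always at least f, matching the slide's observation that S2 ≥ f.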

Aperture and Depth-of-Focus. Scene, Lens, Sensor, focus depth, blur, f, S1, S2. For the same focal length: smaller aperture → larger focus depth, but less light.

Aperture and Depth-of-Focus. Scene, Lens, Sensor, focus depth, blur, f, S1, S2. For the same focal length: larger aperture → smaller focus depth, but more light.
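The aperture/depth-of-focus trade-off can be checked numerically. A hedged sketch (names and units are mine), combining the thin lens law with similar triangles: a point whose image forms at distance s2_obj behind the lens leaves a blur circle of diameter A·|s2_obj − s2_sensor|/s2_obj on a sensor placed at s2_sensor:

```python
def blur_diameter(f, aperture_d, s_focus, s_obj):
    """Diameter of the defocus blur circle on the sensor (same units as f).
    f: focal length; aperture_d: aperture diameter;
    s_focus: distance the lens is focused at; s_obj: actual object distance."""
    img = lambda s: 1.0 / (1.0 / f - 1.0 / s)   # thin-lens image distance
    s2_sensor, s2_obj = img(s_focus), img(s_obj)
    return aperture_d * abs(s2_obj - s2_sensor) / s2_obj

# Halving the aperture halves the blur, confirming the slide's trade-off:
b_wide = blur_diameter(50.0, 25.0, 1000.0, 2000.0)
b_narrow = blur_diameter(50.0, 12.5, 1000.0, 2000.0)
```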

Auto-Focus. Phase-based autofocus: used in most SLR cameras. Contrast-based autofocus: maximizes image contrast in the AF region; used in most digital compact cameras. Active autofocus: ultrasonic- and IR-based; used in compact film cameras.

Problem: Map Scene to Display. Domain of human vision: from ~10^-6 to ~10^+8 cd/m^2 (starlight, moonlight, office light, daylight, flashbulb). Range of typical displays: from ~1 to ~100 cd/m^2 (pixel values 0-255).
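Compressing that ~14-order-of-magnitude scene range into 0-255 display codes is the tone-mapping problem. A minimal global-operator sketch (a simple log curve of my own choosing, not the course's method):

```python
import math

def log_tonemap(luminances, l_max):
    """Map scene luminances (cd/m^2, >= 0) to display values 0..255
    with a global logarithmic curve anchored at the scene maximum l_max."""
    scale = 255.0 / math.log1p(l_max)
    return [round(scale * math.log1p(l)) for l in luminances]

# Eight orders of magnitude still land, in order, inside the display's 8-bit range:
codes = log_tonemap([0.0, 0.01, 1.0, 100.0, 1e4, 1e6], 1e6)
```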

Dynamic Range Limits. Under-exposure: highlight details captured, shadow details lost. Over-exposure: highlight details lost, shadow details captured.

Exposure: a linear relationship among shutter speed, aperture size, and film sensitivity (ISO).
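That linear relationship can be made concrete: total exposure scales linearly with shutter time and ISO, and with aperture area, which goes as 1/N^2 for f-number N. A sketch (the function is illustrative, not a standard API):

```python
def relative_exposure(shutter_s, f_number, iso):
    """Relative exposure: linear in shutter time and ISO, and in
    aperture area, which is proportional to 1 / f_number**2."""
    return shutter_s * iso / (f_number ** 2)

# Doubling the shutter time is one stop, exactly like doubling ISO:
base = relative_exposure(1 / 100, 2.8, 100)
slower = relative_exposure(1 / 50, 2.8, 100)
faster_film = relative_exposure(1 / 100, 2.8, 200)
```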

Auto-Exposure [Nikon Matrix Metering] Images removed due to copyright restrictions. Scanned product technical literature, similar to that presented at http://www.mir.com.my/rb/photography/hardwares/classics/nikonf4/metering/index1.htm

Color sensing in Digital Cameras. Bayer filter pattern. Source: Wikipedia (User:Cburnett). License CC BY-SA. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse. Foveon X3 sensor. Source: Wikipedia (User:Anoneditor). License CC BY-SA. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse.

Electromagnetic spectrum. Source: NASA. Visible light: ~400-700 nm wavelength.

CIE 1931 Chromaticity Diagram

Three Color Primaries (R, G, B): gamuts of the sRGB color space, Fuji Velvia 50 film, and the Nikon D70 camera, shown on a chromaticity diagram.

Epsilon Photography: capture multiple photos, each with slightly different camera parameters: exposure settings, spectrum/color settings, focus settings, camera position, scene illumination.

Epsilon Photography: epsilon over time (bracketing); epsilon over sensors (3CCD, SAMP, camera arrays); epsilon over pixels (Bayer); epsilon over multiple axes.

Epsilon over time (Bracketing) Capture a sequence of images (over time) with epsilon change in parameters

High Dynamic Range (HDR) capture. Negative film ≈ 250:1 (8 stops); paper prints ≈ 50:1; [Debevec 97] ≈ 250,000:1 (18 stops). An old idea [Mann & Picard 1990], and a hot topic at recent SIGGRAPHs. Images removed due to copyright restrictions. Memorial Church photo sequence from Paul Debevec, Recovering High Dynamic Range Radiance Maps from Photographs (SIGGRAPH 1997).
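The core of multi-exposure HDR capture is simple once the sensor response is linearized: each exposure gives an estimate of radiance as pixel value divided by exposure time, and the estimates are combined with weights that favor mid-range (well-exposed) pixels. A toy sketch in that spirit (the hat-shaped weight and function names are my own; real pipelines such as Debevec & Malik 97 also recover the camera response curve):

```python
def merge_hdr(images, exposure_times):
    """Merge linear-response exposures into a relative radiance map.
    images: list of equal-length pixel lists with values in [0, 255];
    exposure_times: one exposure time per image."""
    def weight(z):
        # hat weight: trust mid-range values, distrust near-black/near-white
        return min(z, 255 - z) + 1e-6
    hdr = []
    for pix in zip(*images):
        num = sum(weight(z) * (z / t) for z, t in zip(pix, exposure_times))
        den = sum(weight(z) for z in pix)
        hdr.append(num / den)
    return hdr

# A pixel reading 50 at t=1 and 100 at t=2 is consistent with radiance ~50:
radiance = merge_hdr([[50], [100]], [1.0, 2.0])
```

The weight also quietly discards saturated samples, which is what lets the merged result exceed any single exposure's range.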

Epsilon over time (Bracketing). Prokudin-Gorskii, Sergei Mikhailovich, 1863-1944, photographer. "The Bukhara Emir," Prints and Photographs Division, Library of Congress.

Epsilon over time (Bracketing) Image courtesy of shannonpatrick17 on Flickr. Color wheel used in DLP projectors

Epsilon over sensors Capture a set of images (over different sensors or cameras) with epsilon change in parameters

Epsilon over sensors: 3CCD imaging system for color capture. Left: Wikipedia User:Cburnett; upper right: Wikipedia User:Xingbo. License CC BY-SA. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse.

Epsilon over sensors Single-Axis Multi-Parameter (SAMP) Camera [McGuire et al, 2005] Multiple cameras at the same virtual position Images removed due to copyright restrictions.

Epsilon over sensors Camera Arrays Epsilon over camera position Image removed due to copyright restrictions. 64 tightly packed commodity CMOS webcams, 30 Hz, scalable, real-time [Yang, J. C. et al. "A Real-Time Distributed Light Field Camera." Eurographics Workshop on Rendering (2002), pp. 1 10]

Epsilon over sensors Stanford Camera Array [Wilburn et al, SIGGRAPH 2005] Photo of camera array removed due to copyright restrictions. See High Performance Imaging Using Large Camera Arrays.

Epsilon over pixels Capture images (over different pixels on the same sensor) with epsilon change in parameters

Epsilon over pixels: Bayer mosaicing for color capture. Images: Wikipedia (User:Cburnett). License CC BY-SA. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse. Estimate RGB at G cells from neighboring values.
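Demosaicing's simplest form is bilinear interpolation from neighboring samples. A sketch of the green-channel step for an RGGB mosaic (pure Python and illustrative only; real cameras use edge-aware methods to preserve resolution):

```python
def interpolate_green(mosaic):
    """Bilinear green-channel estimate for an RGGB Bayer mosaic.
    mosaic: 2-D list of raw values; green sites are where (row+col) is odd."""
    h, w = len(mosaic), len(mosaic[0])
    green = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if (r + c) % 2 == 1:            # green site: copy the sample
                green[r][c] = mosaic[r][c]
            else:                            # red/blue site: average green neighbors
                nbrs = [mosaic[rr][cc]
                        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= rr < h and 0 <= cc < w]
                green[r][c] = sum(nbrs) / len(nbrs)
    return green
```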

Epsilon over multiple axes Image removed due to copyright restrictions.

Generalized Mosaicing [Schechner and Nayar, ICCV 2001] 2001 IEEE. Courtesy of IEEE. Used with permission.

HDR From Multiple Measurements Captured Images Computed Image Mitsunaga, T. and S. Nayar. Radiometric Self Calibration. CVPR 1999. Ginosar et al 92, Burt & Kolczynski 93, Madden 93, Tsai 94, Saito 95, Mann & Picard 95, Debevec & Malik 97, Mitsunaga & Nayar 99, Robertson et al. 99, Kang et al. 03 1999 IEEE. Courtesy of IEEE. Used with permission.

Sequential Exposure Change: Ginosar et al 92, Burt & Kolczynski 93, Madden 93, Tsai 94, Saito 95, Mann 95, Debevec & Malik 97, Mitsunaga & Nayar 99, Robertson et al. 99, Kang et al. 03 time Mosaicing with Spatially Varying Filter: (Pan or move the camera) Schechner and Nayar 01, Aggarwal and Ahuja 01 time Multiple Image Detectors: Doi et al. 86, Saito 95, Saito 96, Kimura 98, Ikeda 98, Aggarwal & Ahuja 01,

Multiple Sensor Elements in a Pixel: Handy 86, Wen 89, Murakoshi 94, Konishi et al. 95, Hamazaki 96, Street 98. Assorted Pixels (generalized Bayer grid): trade resolution for multiple exposures and colors; Nayar and Mitsunaga 00, Nayar and Narasimhan 02. Computational Pixels (pixel sensitivity set by its illumination): Brajovic & Kanade 96, Ginosar & Gnusin 97, Serafini & Sodini 00.

Assorted Pixels [Nayar and Narasimhan 03]. (Bayer grid: alternating R G / G B rows of interleaved color filters.) Let's interleave OTHER assorted measures too. De-mosaicking helps preserve resolution.

Assorted Pixels [Nayar and Narasimhan 03]. Digital still camera vs. camera with assorted pixels.

Adaptive Light Attenuator [Nayar and Branzoi, ICCV 2003]. An LCD light attenuator limits the image intensity reaching an 8-bit sensor: a controller reads each detector element's output I_t and sets the corresponding attenuator element's transmittance T_t+1. Comparison: unprotected 8-bit sensor output vs. attenuator-protected 8-bit sensor output. Photos © 2003 IEEE. Courtesy of IEEE. Used with permission.

High Dynamic Range (HDR) display [Seetzen, Heidrich, et al., SIGGRAPH 2004]. Image removed due to copyright restrictions. Schematic of HDR display with projector, LCD, and optics; and photo of the working display. See Figure 4 in Seetzen, H., et al., High Dynamic Range Display Systems. ACM Transactions on Graphics (Proceedings of SIGGRAPH 2004) 23, no. 3 (August 2004): 760-768. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.128.1621&rep=rep1&type=pdf

Focus: extending the depth of field Focal stacks - used in microscopy Light field cameras

FUSION: Best-Focus Distance. Source images → graph-cuts solution → fusion. Several slides removed due to copyright restrictions: sequence of photos of an insect head with a progression of focal points. See the extended depth-of-field example in Agarwala, A., et al., Interactive Digital Photomontage (SIGGRAPH 2004).
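A per-pixel sharpness vote captures the basic idea behind these fusion results. The real system uses graph cuts for seamless regions; this greedy sketch (with names of my own choosing) just picks the locally sharpest source image at every pixel:

```python
def sharpness(img, r, c):
    """Laplacian-like local contrast at (r, c): |k*center - sum of k neighbors|."""
    h, w = len(img), len(img[0])
    nbrs = [img[rr][cc]
            for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            if 0 <= rr < h and 0 <= cc < w]
    return abs(len(nbrs) * img[r][c] - sum(nbrs))

def fuse_focal_stack(stack):
    """All-in-focus composite: at each pixel, take the value from the
    image in the stack with the highest local sharpness there."""
    h, w = len(stack[0]), len(stack[0][0])
    return [[max(stack, key=lambda im: sharpness(im, r, c))[r][c]
             for c in range(w)] for r in range(h)]

# The sharp vertical edge wins over the blurred version of the same edge:
img_sharp = [[0, 10, 0], [0, 10, 0], [0, 10, 0]]
img_blur = [[3, 4, 3], [3, 4, 3], [3, 4, 3]]
fused = fuse_focal_stack([img_sharp, img_blur])
```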

Focus: Light field camera Light field focal stack extended DOF Courtesy of Ren Ng. Used with permission.

Focus: shallow depth of field Lots of glass; Heavy; Bulky; Expensive Example photos removed due to copyright restrictions.

Defocus Magnification [Bae and Durand 2007] Images removed due to copyright restrictions. See Figure 1 in Bae, S., and F. Durand. "Defocus Magnification." Comput Graph Forum 26, no. 3 (2007): 571-579.

Synthetic aperture photography Huge lens ray bundle is now summed COMPUTATIONALLY: Σ

Synthetic aperture photography Computed image: large lens ray bundle Summed for each pixel Σ

Synthetic aperture photography. An impossibly large lens would gather a bundle of rays for each image point; a camera array gathers and sums the same sets of rays. Σ
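Computationally, synthetic aperture focusing is just shift-and-sum: translate each camera's image in proportion to its baseline, then average, so objects at the chosen depth align (and stay sharp) while objects off that depth smear away, just as with a physical large aperture. A 1-D toy sketch (names and sign conventions are mine):

```python
def synthetic_aperture(images, offsets, focus_shift):
    """Shift-and-sum refocusing over 1-D images (rows of pixels).
    offsets: per-camera baseline, in pixels of parallax per unit focus_shift;
    focus_shift selects the depth plane that will come into focus."""
    w = len(images[0])
    out = []
    for x in range(w):
        samples = []
        for img, off in zip(images, offsets):
            xs = x + round(off * focus_shift)
            if 0 <= xs < w:
                samples.append(img[xs])
        out.append(sum(samples) / len(samples))
    return out

# A point at disparity 1 between two cameras focuses at focus_shift=1
# and smears (averages with background) at focus_shift=0:
cam0 = [0, 0, 9, 0, 0]
cam1 = [0, 0, 0, 9, 0]
refocused = synthetic_aperture([cam0, cam1], [0, 1], 1.0)
defocused = synthetic_aperture([cam0, cam1], [0, 1], 0.0)
```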

Synthetic aperture photography. The camera array is far away from these bushes, yet it sees through them. Vaish, V., et al., "Using Plane + Parallax for Calibrating Dense Camera Arrays," Proceedings of CVPR 2004. Courtesy of IEEE. Used with permission. © 2004 IEEE.

Focus Adjustment: Sum of Bundles [Vaish et al. 2004] Vaish, V., et al. "Using Plane + Parallax for Calibrating Dense Camera Arrays." Proceedings of CVPR 2004. Courtesy of IEEE. Used with permission. 2004 IEEE.

Uncalibrated Synthetic Aperture [Kusumoto, Hiura, Sato, CVPR 2009] 2009 IEEE. Courtesy of IEEE. Used with permission.

Uncalibrated Synthetic Aperture [Kusumoto, Hiura, Sato, CVPR 2009] Focus in front Focus in back 2009 IEEE. Courtesy of IEEE. Used with permission.

Image Destabilization [Mohan, Lanman et al. 2009] Camera Lens Sensor Static Scene

Image Destabilization [Mohan, Lanman et al. 2009] Camera Static Scene Lens Motion Sensor Motion

MIT Media Lab. Lens-Based Focusing (diagram: Lens, Sensor, scene points A and B).

Smaller aperture → smaller defocus blur (diagram: Lens, Sensor, A, B).

Pinhole: All In-Focus (diagram: Pinhole, Sensor, A, B).

Shifting Pinhole (diagram: pinhole shifts with velocity v_p; Sensor; scene points A, B).

Shifting Pinhole (diagram: Pinhole, Sensor, A, B; labels v_p, t_p, d_a, d_b, d_s).

Shifting Pinhole and Sensor (diagram: pinhole velocity v_p, sensor velocity v_s; distances d_a, d_b, d_s; "Focus Here" plane indicated).

A Lens in Time! Lens equation; virtual focal length; virtual f-number. Analogous to shift-and-sum light field re-focusing.

Our Prototype. © 2009 IEEE. Courtesy of IEEE. Used with permission.

Adjusting the Focus Plane: all-in-focus pinhole image. © 2009 IEEE. Courtesy of IEEE. Used with permission.

Defocus Exaggeration: destabilization simulates a reduced f-number. © 2009 IEEE. Courtesy of IEEE. Used with permission.

Large apertures with tiny lenses? Benefits: no time or light inefficiency with respect to cheap cameras; exploits unused area around the lens; compact design; with near-pinhole apertures (mobile phones), many possibilities. Limitations: coordinated mechanical movement required; diffraction (due to the small aperture) cannot be eliminated [Zhang and Levoy, tomorrow] [our group: augmented LF for wave analysis]; scene motion during exposure. Figure by MIT OpenCourseWare. Photo courtesy of Wikipedia User:Lipton_sale.

Increasing Spatial Resolution Superresolution Panoramas over time Panoramas over sensors

Capturing Gigapixel Images [Kopf et al., SIGGRAPH 2007]. Image removed due to copyright restrictions. See Fig. 4b in Kopf, J., et al., Capturing and Viewing Gigapixel Images (SIGGRAPH 2007). 3,600,000,000 pixels, created from about 800 eight-megapixel images.

Capturing Gigapixel Images [Kopf et al, 2007] Image removed due to copyright restrictions. See Fig. 4b in Kopf, J., et al. Capturing and Viewing Gigapixel Images. Proceedings of SIGGRAPH 2007. 150 degrees Normal perspective projections cause distortions.

Capturing Gigapixel Images [Kopf et al, 2007] Image removed due to copyright restrictions. See Fig. 4b in Kopf, J., et al. Capturing and Viewing Gigapixel Images. Proceedings of SIGGRAPH 2007. 100X variation in Radiance High Dynamic Range

A tiled camera array. Photo removed due to copyright restrictions. See http://graphics.stanford.edu/projects/array/images/tiled-side-view-cessh.jpg (Figure 1a in Wilburn, B., et al., SIGGRAPH 2005). 12 × 8 array of VGA cameras; abutted: 7680 × 3840 pixels; overlapped 50%: half of this total. Field of view = 29° wide (seamless mosaicing isn't hard). Cameras individually metered; approximately the same center of projection.
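The pixel counts quoted above are straightforward arithmetic; a quick check (assuming VGA means 640 × 480, and modeling "overlapped 50%" literally as half the abutted span):

```python
cams_x, cams_y = 12, 8          # 12 x 8 camera grid
vga_w, vga_h = 640, 480         # VGA sensor resolution

abutted = (cams_x * vga_w, cams_y * vga_h)       # cameras edge-to-edge
overlapped = (abutted[0] // 2, abutted[1] // 2)  # 50% overlap: half the span
```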

Tiled panoramic image (before geometric or color calibration) Photo removed due to copyright restrictions.

Tiled panoramic image (after geometric or color calibration) Photo removed due to copyright restrictions.

Three images removed due to copyright restrictions; similar to Figs. 6 and 7 in Wilburn, B., et al., High Performance Imaging Using Large Camera Arrays (SIGGRAPH 2005). Same exposure in all cameras (1/60, 1/60, 1/60, 1/60); individually metered (1/120, 1/60, 1/60, 1/30); same and overlapped 50% (1/120, 1/60, 1/60, 1/30).

Increasing Temporal Resolution. Say you want 120 frame-per-second (fps) video. You could get one camera that runs at 120 fps, or get 4 cameras running at 30 fps.
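Getting 120 fps from four 30 fps cameras works by staggering the trigger phases a quarter period apart and interleaving the streams. A timing sketch (function name is mine):

```python
def staggered_timestamps(n_cameras, fps_each, frames_per_cam):
    """Frame times for n cameras at fps_each, triggered with phase offsets
    of 1/(n_cameras * fps_each); interleaved, the combined stream runs at
    n_cameras * fps_each frames per second."""
    period = 1.0 / fps_each
    stagger = period / n_cameras
    frames = [(cam * stagger + i * period, cam)   # (capture time, camera id)
              for cam in range(n_cameras)
              for i in range(frames_per_cam)]
    return sorted(frames)

timeline = staggered_timestamps(4, 30, 2)  # 4 cameras at 30 fps -> 120 fps
```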

Increasing Temporal Resolution: High-Speed Videography Using a Dense Camera Array [Wilburn et al., CVPR 2004]. 1560 fps video of a popping balloon. © 2004 IEEE. Courtesy of IEEE. Used with permission.

Epsilon Photography: modify exposure settings, spectrum/color settings, focus settings, camera position, or scene illumination; over time (bracketing), over sensors (SAMP, camera arrays), over pixels (Bayer).

MIT OpenCourseWare http://ocw.mit.edu MAS.531 Computational Camera and Photography Fall 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.