Lecture 18: Light field cameras (plenoptic cameras). Visual Computing Systems, CMU, Fall 2013


Continuing theme: computational photography
Cameras capture light, then extensive processing produces the desired image.
Today:
- Capturing light fields (not just photographs) with a handheld camera
- Implications for photography

Recall: the light field
The light field is a 4D function (it represents light in free space, with no occlusion). [Image credit: Levoy and Hanrahan 96]
Two-plane parameterization: a ray is described by connecting a point on the (u,v) plane with a point on the (s,t) plane.
More general: the plenoptic function (Adelson and Bergen 1991)
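A minimal sketch (not from the lecture) of how a discretized two-plane light field might be stored and indexed; the sample counts N_U, N_V, N_S, N_T and the helper radiance() are illustrative assumptions.

import numpy as np

# Samples on the (u, v) plane (e.g., positions on the aperture) and on the
# (s, t) plane (e.g., image coordinates). Sizes here are arbitrary.
N_U, N_V = 16, 16
N_S, N_T = 256, 256

# The 4D light field: one radiance value per sampled (u, v, s, t) ray.
L = np.zeros((N_U, N_V, N_S, N_T), dtype=np.float32)

def radiance(u, v, s, t):
    # Radiance along the ray joining sample (u, v) on one plane to (s, t) on the other.
    return L[u, v, s, t]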

Light field inside a camera
[Figure: lens diagram and ray-space plot (showing only the X-U 2D projection); labels: scene focal plane, field of view, lens aperture (U,V), sensor plane (X,Y), pixels P1 and P2]

Decrease aperture size
[Figure: the same lens diagram and ray-space plot with a smaller aperture; labels: scene focal plane, lens aperture (U,V), sensor plane (X,Y), pixels P1 and P2]

Defocus
[Figure: lens diagram and ray-space plot for a defocused scene point; labels: scene focal plane, lens aperture (U,V), sensor plane (X,Y), pixels P1 and P2, circle of confusion]

Defocus
[Figure: second defocus example, lens diagram and ray-space plot; labels: scene focal plane, lens aperture (U,V), sensor plane (X,Y), pixels P1 and P2]

Stanford Camera Array (Wilburn et al. 2005)
- 640 x 480 tightly synchronized, repositionable cameras
- Custom processing board per camera
- Tethered to PCs for additional processing/storage
- Host PC with disk array

Capturing a light field

Synthetic aperture (Wilburn et al. 2005)
- Simulate image formation by a virtual camera with a large aperture (a virtual lens spanning the array)
- Shift and add the camera images (see the sketch below)
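A minimal sketch of the shift-and-add idea, assuming pre-rectified camera images and known per-camera baselines; the names views, baselines and the scalar depth_scale are illustrative assumptions, not from Wilburn et al.

import numpy as np

def synthetic_aperture(views, baselines, depth_scale):
    # views: list of HxW float images, one per camera, already rectified.
    # baselines: list of (bx, by) camera offsets from the array center.
    # depth_scale: scalar that selects where the virtual focal plane lands.
    acc = np.zeros_like(views[0], dtype=np.float64)
    for img, (bx, by) in zip(views, baselines):
        dy = int(round(depth_scale * by))   # per-camera shift, proportional to baseline
        dx = int(round(depth_scale * bx))
        acc += np.roll(img, (dy, dx), axis=(0, 1))
    return acc / len(views)                 # average = image through a large virtual aperture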

Refocused synthetic aperture image
[Figure: virtual lens with a virtual focal plane]

Plenoptic camera [Adelson and Wang, 1992]: measure the plenoptic function for single-lens stereo applications

Handheld light field camera (Ng et al. 2005)
[Figure: lens diagram with a microlens array in front of the sensor; labels: world plane of focus, lens aperture (U,V), microlens array, sensor plane (X,Y), pixels 1 and 2]

Each sensor pixel records a beam of light
[Figure: lens diagram and ray-space plot; labels: world plane of focus, lens aperture (U,V), microlens array, sensor plane (X,Y), pixel 1]

Captured light field
- 16 MP sensor
- 296 x 296 microlens array
- 12 x 12 pixels per microlens
[Image: Ng et al. 2006]

Computing a photograph
[Figure: lens diagram and ray-space plot; labels: world plane of focus, lens aperture (U,V), microlens array, sensor plane (X,Y), pixels 1 and 6]
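One way this computation could look in code, assuming an idealized raw layout in which each microlens covers an axis-aligned 12x12 block of pixels (an assumption for illustration; a real camera needs calibration and demosaicing first): summing the block under each microlens integrates over the aperture and yields an ordinary photograph.

import numpy as np

def conventional_photo(raw, n_micro=296, n_uv=12):
    # raw: (n_micro*n_uv, n_micro*n_uv) NumPy array; each n_uv x n_uv block
    # of pixels is assumed to sit under exactly one microlens.
    lf = raw.reshape(n_micro, n_uv, n_micro, n_uv)   # axes: (x, u, y, v)
    return lf.sum(axis=(1, 3))                       # integrate over the aperture (u, v)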

Sub-aperture image
[Figure: lens diagram; labels: world plane of focus, lens aperture (U,V), microlens array, sensor plane (X,Y)]

Sub-aperture images
Each image displays the light incident on the sensor from a small region of the aperture. Note the slight shift in perspective between images. [Image: Ng et al. 2006]
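A companion sketch under the same assumed raw layout as above: a sub-aperture image is formed by taking the pixel at one fixed aperture position (u, v) under every microlens.

def sub_aperture_image(raw, u, v, n_micro=296, n_uv=12):
    # Pick the single pixel at aperture position (u, v) under each microlens,
    # giving one n_micro x n_micro view of the scene from that part of the lens.
    lf = raw.reshape(n_micro, n_uv, n_micro, n_uv)   # axes: (x, u, y, v)
    return lf[:, u, :, v]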

Digital refocusing
[Figure: lens diagram with a virtual sensor plane (X',Y') in addition to the microlens array and the physical sensor plane (X,Y); labels: lens aperture (U,V)]

Digital refocusing [Image: Ng et al. 2006]

Reparameterization
- F plane: plane of the microlens array
- F' plane: virtual plane of focus
- L_F = light field parameterized by the lens plane and the F plane
- L_F' = light field parameterized by the lens plane and the (virtual) F' plane
- Can define L_F' using L_F
[Image: Ng et al. 2006]
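Written out explicitly (a sketch following the standard two-plane geometry, with α denoting the ratio F'/F, a symbol assumed here): by similar triangles, a ray through (u, v) on the lens and (x', y') on the F' plane crosses the F plane at u + (x' - u)/α, so

L_F'(x', y', u, v) = L_F( u + (x' - u)/α, v + (y' - v)/α, u, v ),   where α = F'/F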

Refocused photograph
- Integrate all the light arriving at point (x', y') on the F' plane
- Define L_F(u,v) to be the sub-aperture image from lens region (u, v)
- The (virtual) refocused photograph is then a sum of shifted, scaled sub-aperture images:
  - Scale each image (can ignore: the scale factor is invariant of lens position)
  - Shift each image by ( u(1 - 1/α), v(1 - 1/α) )
[Image: Ng et al. 2006]
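In the usual notation this shift-and-add recipe corresponds to

E_F'(x', y') = (1 / (α^2 F^2)) ∫∫ L_F( u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α, u, v ) du dv

and a minimal code sketch follows, reusing the assumed raw layout from the earlier sketches; the whole-pixel np.roll shift and the use of microlens units for the shift are simplifications for illustration (a real implementation would interpolate).

import numpy as np

def refocus(raw, alpha, n_micro=296, n_uv=12):
    # Sum shifted sub-aperture images; alpha = F'/F selects the virtual focal plane.
    lf = raw.reshape(n_micro, n_uv, n_micro, n_uv)        # axes: (x, u, y, v)
    out = np.zeros((n_micro, n_micro), dtype=np.float64)
    center = (n_uv - 1) / 2.0                             # aperture coordinate of the lens center
    for u in range(n_uv):
        for v in range(n_uv):
            view = lf[:, u, :, v]                         # sub-aperture image for (u, v)
            dx = int(round((u - center) * (1.0 - 1.0 / alpha)))
            dy = int(round((v - center) * (1.0 - 1.0 / alpha)))
            out += np.roll(view, (dx, dy), axis=(0, 1))   # shift ~ (u(1-1/alpha), v(1-1/alpha))
    return out / (n_uv * n_uv)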

Video

Potential advantages of light-field cameras (for traditional photography)
- Remove (or significantly simplify) auto-focus
  - Diminished shutter lag
- Better low-light shooting
  - Shoot with the aperture wide open (in a traditional camera this means shallow depth of field and a high possibility of misfocus)
  - Can digitally refocus after the shot
  - Can digitally extend depth of field
- New lens form factors and capabilities
  - Correct for aberrations digitally
[Image: Ng et al. 2006]

New photography applications
- Interactive pictures (a single shot captures information that can be used to generate many different pictures)
  - Digital (post-shot) refocusing
  - Parallax
- Stereo images
- Extended depth of field (put the entire image in focus)

Lytro light field camera
- 11 megapixel ("megaray") sensor
- f/2, 8x zoom lens

More computational cameras
- Raytrix plenoptic camera
- Pelican Imaging

Computational challenges
What are the computational challenges of light field photography?

Trends
- No free lunch: directional information is measured at the cost of spatial resolution
  - Ng's original prototype: 16 MP sensor, but output 300 x 300 images
  - Lytro camera: 11 MP sensor, ~1 MP output images
- Light field cameras can make use of increasing sensor pixel densities
  - More directional resolution = increased refocusing capability
  - More spatial resolution at fixed directional resolution
  - Recall: there are few motivations for high-pixel-count sensors in traditional cameras today
- High-resolution cameras introduce computational challenges
  - Processing challenges
  - Storage challenges
  - Data transfer challenges
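As a rough check of the 300 x 300 figure (arithmetic based on the sensor numbers quoted earlier, not stated on this slide): 16,000,000 pixels / (12 x 12 pixels per microlens) ≈ 111,000 spatial samples ≈ 333 x 333 output pixels.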

Modern photography: capture-process-communicate
- Where to perform computation?
- What representation to transmit? The full light field? A single image?
[Figure: pipeline from a future consumer light field camera (~50-100 MP) to cloud storage/processing and a personal computer]

Summary
Light field photography:
- From the user's perspective, very much like traditional photography
- Main idea: capture the light field in a single exposure
- Perform (large amounts of) computation to compute the desired final image