Introduction to Light Fields

1 MIT Media Lab Introduction to Light Fields Camera Culture Ramesh Raskar MIT Media Lab

2 Introduction to Light Fields Ray Concepts for 4D and 5D Functions Propagation of Light Fields Interaction with Occluders Fourier Domain Analysis and Relationship to Fourier Optics Coded Photography: Modern Methods to Capture Light Field Wigner and Ambiguity Function for Light Field in Wave Optics New Results in Augmenting Light Fields

3 Light Fields Goal: representing propagation, interaction and image formation of light using purely position and angle parameters. Radiance per ray. Ray parameterization: Position: s, x, r; Direction: u, θ, s; Reference plane. Courtesy of Se Baek Oh. Used with permission.

4 Limitations of Traditional Lightfields. Wigner Distribution Function: wave optics based, rigorous but cumbersome (holograms, beam shaping, rotational PSF). Traditional Light Field: ray optics based, simple and powerful, but limited in diffraction & interference. Courtesy of Se Baek Oh. Used with permission. Se Baek Oh, 3D Optical Systems Group, CVPR 2009

5 Example: New Representations Augmented Lightfields. The Augmented LF bridges the Wigner Distribution Function (WDF: wave optics based, rigorous but cumbersome) and the Traditional Light Field (ray optics based, simple and powerful, limited in diffraction & interference), adding Interference & Diffraction, Interaction with optical elements, and Non-paraxial propagation. Courtesy of Se Baek Oh. Used with permission. Se Baek Oh, 3D Optical Systems Group, CVPR 2009

6 The Plenoptic Function Figure removed due to copyright restrictions. Q: What is the set of all things that we can ever see? A: The Plenoptic Function (Adelson & Bergen) Let's start with a stationary person and try to parameterize everything that he can see

7 Grayscale snapshot Figure removed due to copyright restrictions. P(θ,φ) is intensity of light Seen from a single view point At a single time Averaged over the wavelengths of the visible spectrum (can also do P(x,y), but spherical coordinates are nicer)

8 Color snapshot Figure removed due to copyright restrictions. P(θ,φ,λ) is intensity of light Seen from a single view point At a single time As a function of wavelength

9 A movie Figure removed due to copyright restrictions. P(θ,φ,λ,t) is intensity of light Seen from a single view point Over time As a function of wavelength

10 Holographic movie Figure removed due to copyright restrictions. P(θ,φ,λ,t,Vx,Vy,Vz) is intensity of light Seen from ANY viewpoint Over time As a function of wavelength

11 The Plenoptic Function Figure removed due to copyright restrictions. P(θ,φ,λ,t,Vx,Vy,Vz) Can reconstruct every possible view, at every moment, from every position, at every wavelength Contains every photograph, every movie, everything that anyone has ever seen.

12 Sampling Plenoptic Function (top view)

13 Ray Let's not worry about time and color: 5D = 3D position + 2D direction P(θ,φ,Vx,Vy,Vz) Courtesy of Rick Szeliski and Michael Cohen. Used with permission. Slide by Rick Szeliski and Michael Cohen

14 Ray No Occluding Objects 4D = 2D position + 2D direction P(θ,φ,Vx,Vy,Vz) The space of all lines in 3-D space is 4D. Courtesy of Rick Szeliski and Michael Cohen. Used with permission. Slide by Rick Szeliski and Michael Cohen

15 Lumigraph/Lightfield - Organization 2D position 2D direction θ s Courtesy of Rick Szeliski and Michael Cohen. Used with permission. Slide by Rick Szeliski and Michael Cohen

16 2D position + 2D position: s and u, 2-plane parameterization Courtesy of Rick Szeliski and Michael Cohen. Used with permission. Slide by Rick Szeliski and Michael Cohen

17 2D position + 2D position: (s,t) and (u,v), 2-plane parameterization Courtesy of Rick Szeliski and Michael Cohen. Used with permission. Slide by Rick Szeliski and Michael Cohen
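
As a concrete (non-slide) illustration of the two-plane parameterization, the sketch below stores a discretized light field as a 4D array indexed by the two plane intersections. The 9×9 and 200×200 sample counts echo numbers quoted later in the deck; the array layout and function names are my own assumptions.

```python
import numpy as np

# Hypothetical discretized light field L[u, v, s, t]: radiance of the ray that
# crosses the (u, v) plane at sample (u, v) and the (s, t) plane at sample (s, t).
U, V, S, T = 9, 9, 200, 200          # assumed sampling of the two planes
L = np.zeros((U, V, S, T), dtype=np.float32)

def radiance(light_field, u, v, s, t):
    """Radiance of the single ray identified by its two plane intersections."""
    return light_field[u, v, s, t]
```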

18 Light Field = Array of (virtual) Cameras Sub-aperture Virtual Camera = Sub-aperture View Σ Based on original slide by Marc Levoy. Used with permission Marc Levoy

19 Conventional versus plenoptic camera. Diagram: Scene; uv-plane with Virtual Camera = (u,v); st-plane with Pixel = (s,t). Based on original slide by Marc Levoy. Used with permission Marc Levoy

20 Light Field = Array of (virtual) Cameras Σ Based on original slide by Marc Levoy. Used with permission Marc Levoy

21 Light Field = Array of (virtual) Cameras Sub-aperture Virtual Camera = Sub-aperture View Σ Courtesy of Marc Levoy. Used with permission Marc Levoy

22 Light Field = Array of (virtual) Cameras Sub-aperture Virtual Camera = Sub-aperture View Σ Based on original slide by Marc Levoy. Used with permission Marc Levoy
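
To make the "array of virtual cameras" idea concrete (my own sketch, not from the slides): fixing (u, v) in a 4D light field gives one sub-aperture view, and summing over all (u, v) reproduces the conventional full-aperture photo, the Σ shown on these slides. The (u, v, s, t) layout follows the parameterization sketch above.

```python
import numpy as np

L = np.random.rand(9, 9, 200, 200).astype(np.float32)   # stand-in light field, L[u, v, s, t]

def sub_aperture_view(light_field, u, v):
    """Image seen by the virtual camera at aperture position (u, v)."""
    return light_field[u, v]

def full_aperture_photo(light_field):
    """Conventional photo: integrate (here, sum) radiance over the whole aperture."""
    return light_field.sum(axis=(0, 1))

center_view = sub_aperture_view(L, 4, 4)
photo = full_aperture_photo(L)
```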

23 Light Field Inside a Camera Courtesy of Ren Ng. Used with permission.

24 Light Field Inside a Camera Lenslet-based Light Field camera [Adelson and Wang 1992, Ng et al. 2005] Courtesy of Ren Ng. Used with permission.

25 Stanford Plenoptic Camera [Ng et al 2005] Contax medium format camera, Kodak 16-megapixel sensor, Adaptive Optics microlens array, 125 μm square-sided microlenses. Courtesy of Ren Ng. Used with permission. pixels lenses = pixels per lens

26 Digital Refocusing [Ng et al 2005] Courtesy of Ren Ng. Used with permission.
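
Digital refocusing is often explained as "shift and add": each sub-aperture view is translated in proportion to its aperture offset and then averaged, which synthetically moves the focal plane. The sketch below is a minimal integer-shift version under the (u, v, s, t) layout assumed earlier; it is not Ng et al.'s implementation, which uses sub-pixel interpolation (or the Fourier-slice method).

```python
import numpy as np

def refocus(light_field, slope):
    """Shift-and-add refocusing sketch.

    light_field -- array of shape (U, V, S, T), layout as assumed above
    slope       -- pixels of shift per unit aperture offset; varying it moves
                   the synthetic focal plane (slope = 0 keeps the captured focus)
    """
    U, V, S, T = light_field.shape
    out = np.zeros((S, T), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            du = int(round(slope * (u - U // 2)))
            dv = int(round(slope * (v - V // 2)))
            # integer shifts via np.roll keep the sketch short; real code
            # would interpolate to sub-pixel accuracy
            out += np.roll(np.roll(light_field[u, v], du, axis=0), dv, axis=1)
    return out / (U * V)
```

Sweeping `slope` over a range of values produces a focal stack from a single captured light field.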

27 Adaptive Optics A deformable mirror can be used to correct wavefront errors in an astronomical telescope.

28 Shack-Hartmann wavefront sensor (commonly used in adaptive optics).

29 Measuring shape of wavefront = Lightfield Capture Courtesy of David Williams and the Center for Visual Science, University of Rochester. Used with permission. The spots formed on the CCD chip for the eye will be displaced because the wavefront will hit each lenslet at an angle rather than straight on.
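
The displacement of each Shack-Hartmann spot divided by the lenslet focal length gives the average wavefront slope (i.e. ray angle) over that lenslet, which is exactly a sample of the light field's angular coordinate at that aperture position. A small sketch of that conversion follows; the argument names and units are illustrative assumptions.

```python
import numpy as np

def wavefront_slopes(spot_xy, ref_xy, lenslet_focal_length, pixel_pitch):
    """Local wavefront slopes from Shack-Hartmann spot displacements.

    spot_xy, ref_xy       -- (N, 2) arrays of measured and reference spot
                             centroids, in pixels (one row per lenslet)
    lenslet_focal_length  -- lenslet focal length, same unit as pixel_pitch
    pixel_pitch           -- physical size of a sensor pixel

    The displacement of each focal spot divided by the lenslet focal length is
    the average wavefront slope (ray angle) over that lenslet -- a sample of
    the light field's angular coordinate at that aperture position.
    """
    displacement = (np.asarray(spot_xy) - np.asarray(ref_xy)) * pixel_pitch
    return displacement / lenslet_focal_length      # (N, 2) slopes: dW/dx, dW/dy
```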

30 Example using 45 cameras [Vaish CVPR 2004] Vaish, V., et al. "Using Plane + Parallax for Calibrating Dense Camera Arrays." Proceedings of CVPR 2004. Courtesy of IEEE. Used with permission. Courtesy of Marc Levoy. Used with permission.

31 Synthetic aperture videography Image removed due to copyright restrictions.

32 Vaish, V., et al. "Using Plane + Parallax for Calibrating Dense Camera Arrays." Proceedings of CVPR 2004. Courtesy of IEEE. Used with permission.

33 Visualizing the Lightfield: (i) position-angle space, (ii) phase space, (iii) space-spatial frequency, (iv) spectrogram. Diagram: rays x1, x2 with angles θi, θj shown as points in the l(x,θ) plane.

34 Shear of Light Field: x'1 = x1 + θi·z. Diagram: free-space propagation by a distance z shears the light field l(x,θ) along x; the ray leaving x1 at angle θi arrives at x'1.
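
The shear relation x'1 = x1 + θi·z says that propagating a light field through free space only re-indexes its position coordinate. A minimal numpy sketch of that resampling is below, assuming a uniformly sampled l(x,θ) on an increasing 1D position grid (my own discretization, not from the slides).

```python
import numpy as np

def propagate(l, x, theta, z):
    """Shear a sampled light field l(x, theta) by free-space propagation.

    Implements x' = x + theta * z: the radiance arriving at x' with angle theta
    is the radiance that left x = x' - theta * z with the same angle.
    l     -- 2D array, l[i, j] = radiance at position x[i], angle theta[j]
    x     -- 1D increasing array of positions (uniform grid assumed)
    theta -- 1D array of angles
    z     -- propagation distance
    """
    out = np.empty_like(l)
    for j, th in enumerate(theta):
        # sample the field at the sheared positions x - theta*z (linear interp.)
        out[:, j] = np.interp(x - th * z, x, l[:, j], left=0.0, right=0.0)
    return out
```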

35 Diagram: a light field l(x,θ) plotted over position x and angle θ.

36 Diagram: light fields l(x,θ) over position x and angle θ, with radiance values 10 and 0 indicated.

37 Diagram: light fields l(x,θ) over position x and angle θ, with radiance values 10 and 0 indicated.

38 Light Field = Array of (virtual) Cameras Sub-aperture Virtual Camera = Sub-aperture View Σ Courtesy of Marc Levoy. Used with permission Marc Levoy

39 Three ways to capture the LF inside a camera: shadows using a pin-hole array; refraction using a lenslet array; heterodyning using masks

40 Sub-Aperture = Pin-hole + Prism Optical Society of America and H. E. Ives. All rights reserved. This content is excluded from our Creative Commons license. For more information, see

41 Ives 1933 Optical Society of America and H. E. Ives. All rights reserved. This content is excluded from our Creative Commons license. For more information, see

42 MERL, MIT Media Lab Glare Aware Photography: 4D Ray Sampling for Reducing Glare Raskar, Agrawal, Wilson & Veeraraghavan

43 MERL, MIT Media Lab Glare Aware Photography: 4D Ray Sampling for Reducing Glare Raskar, Agrawal, Wilson & Veeraraghavan

44 Lens Glare Reduction [Raskar, Agrawal, Wilson, Veeraraghavan SIGGRAPH 2008] Glare/Flare due to camera lenses reduces contrast

45 MERL, MIT Media Lab Glare Aware Photography: 4D Ray Sampling for Reducing Glare Raskar, Agrawal, Wilson & Veeraraghavan Reducing Glare Conventional Photo After removing outliers Glare Reduced Image Raskar, R., et al. Glare Aware Photography: 4D Ray Sampling for Reducing Glare Effects of Camera Lenses. Proceedings of SIGGRAPH 2008.

46 Light Field Inside a Camera Lenslet-based Light Field camera [Adelson and Wang 1992, Ng et al. 2005] Courtesy of Ren Ng. Used with permission.

47 Prototype camera Contax medium format camera, Kodak 16-megapixel sensor, Adaptive Optics microlens array, 125 μm square-sided microlenses. Courtesy of Ren Ng. Used with permission. pixels lenses = pixels per lens

48 Courtesy of Ren Ng. Used with permission.

49 Zooming into the raw photo Courtesy of Ren Ng. Used with permission Marc Levoy

50 Digital Refocusing [Ng et al 2005] Courtesy of Ren Ng. Used with permission. Can we achieve this with a Mask alone?

51 Mask based Light Field Camera Mask Sensor [Veeraraghavan, Raskar, Agrawal, Tumblin, Mohan, SIGGRAPH 2007]

52 How to Capture 4D Light Field with 2D Sensor? What should be the pattern of the mask?

53 Lens Copies the Lightfield of the Conjugate Plane. Diagram: Object, Main Lens, 1D Sensor, with θ-plane and x-plane coordinates (x0, θ0).

54 Diagram: Object, Main Lens, 1D Sensor (θ-plane, x-plane). The captured photo is a line integral of the light field l(x,θ).

55 Diagram: Object, Main Lens, 1D Sensor (θ-plane, x-plane). The captured photo is a line integral of the light field l(x,θ).

56 Fourier Slice Theorem. The light field l(x,θ) and its 2-D FFT L(fx,fθ): the captured photo is a line integral of l(x,θ), and the 1-D FFT of the captured photo is a central slice of L(fx,fθ).
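
The slice relation is easy to check numerically: summing a sampled l(x,θ) over θ (the captured photo) and taking its 1-D FFT gives the fθ = 0 row of the 2-D FFT of l. The toy check below uses my own random light field and only the in-focus, axis-aligned slice.

```python
import numpy as np

# Toy 2D light field l(x, theta)
l = np.random.rand(128, 64)              # axis 0: x, axis 1: theta

photo = l.sum(axis=1)                    # captured photo = line integral over theta

# 1D FFT of the photo vs. the f_theta = 0 slice of the 2D FFT of l
photo_fft = np.fft.fft(photo)
slice_fft = np.fft.fft2(l)[:, 0]         # central slice along the f_x axis

print(np.allclose(photo_fft, slice_fft)) # True: projection <-> central slice
```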

57 Light Propagation (Defocus Blur). Propagation shears the light field l(x,θ), so the captured photo is still a line integral of l(x,θ) and its 1-D FFT is still a central slice of the 2-D FFT L(fx,fθ), now along a sheared direction.

58 In Focus Photo LED

59 Out of Focus Photo: Open Aperture

60 Coded Aperture Camera The aperture of a 100 mm lens is modified: insert a coded mask with a chosen binary pattern; the rest of the camera is unmodified
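
In the thin-lens approximation the defocus blur of an out-of-focus point is roughly a scaled copy of the aperture pattern, so the binary code directly shapes the blur kernel and its frequency response; broadband codes avoid the deep spectral nulls of an open aperture and make the blur easier to invert. The sketch below simulates coded defocus blur under those assumptions; the 7×7 pattern is an arbitrary illustration, not the published mask.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def defocus_blur(image, aperture, blur_size):
    """Simulate defocus blur for a (possibly coded) aperture.

    image     -- 2D grayscale image
    aperture  -- small 2D array of 0s/1s describing the aperture code
    blur_size -- approximate blur diameter in pixels
    """
    reps = max(1, blur_size // aperture.shape[0])
    psf = np.kron(aperture.astype(float), np.ones((reps, reps)))  # scale code to blur size
    psf /= psf.sum()
    kernel = np.zeros_like(image, dtype=float)
    kernel[:psf.shape[0], :psf.shape[1]] = psf
    # circular convolution via FFT keeps the sketch short
    return np.real(ifft2(fft2(image) * fft2(kernel)))

# Hypothetical 7x7 binary code (not the published mask pattern).
coded_aperture = np.array([[1, 0, 1, 1, 0, 1, 1],
                           [0, 1, 1, 0, 1, 1, 0],
                           [1, 1, 0, 1, 1, 0, 1],
                           [1, 0, 1, 1, 0, 1, 1],
                           [0, 1, 1, 0, 1, 1, 0],
                           [1, 1, 0, 1, 1, 0, 1],
                           [1, 0, 1, 1, 0, 1, 1]])
blurred = defocus_blur(np.random.rand(256, 256), coded_aperture, 21)
```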

61 Out of Focus Photo: Coded Aperture

62 Modeling and Synthesis of Aperture Effects in Cameras Douglas Lanman, Ramesh Raskar, and Gabriel Taubin Computational Aesthetics, June 2008

63 Slides removed due to copyright restrictions. See this paper and associated presentation at

64 Cosine Mask Used. Mask Tile (one period, 1/f0).

65 Captured 2D Photo Encoding due to Mask

66 Veeraraghavan, Raskar, Agrawal, Mohan, Tumblin. Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing. Proceedings of SIGGRAPH 2007. Traditional Camera Photo and magnitude of its 2D FFT; Heterodyne Camera Photo and magnitude of its 2D FFT

67 Extra sensor bandwidth cannot capture the extra angular dimension of the light field. Diagram: Fourier Light Field Space (Wigner Transform) with axes fx, fθ and bandwidths fx0, fθ0; the Sensor Slice lies along fx, and extra sensor bandwidth extends only along fx.

68 Sensor Slice captures entire Light Field. Diagram: in (fx, fθ) space, a Modulation Function produces a Modulated Light Field whose content all falls on the sensor slice.

69 Where to place the Mask? Diagram: Mask and Sensor; the Mask Modulation Function shown in (fx, fθ).

70 Computing the 4D Light Field: 2D sensor photo (1800×1800) → 2D FFT (1800×1800, containing 9×9 = 81 spectral copies) → rearrange 2D tiles into 4D planes (200×200×9×9) → 4D IFFT → 4D light field (200×200×9×9). Veeraraghavan, Raskar, Agrawal, Mohan, Tumblin. Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing. Proceedings of SIGGRAPH 2007.
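
A minimal numpy sketch of that pipeline, under my own simplifying assumptions (ideal 9×9 spectral copies, no demodulation phase correction or windowing, tile ordering glossed over); it is not the authors' implementation.

```python
import numpy as np

def heterodyne_light_field(photo, n_ang=9):
    """Rearrange the spectral copies in a mask-modulated photo into a 4D light field.

    Sketch of the pipeline on this slide: 2D FFT of the sensor image, cut the
    spectrum into an n_ang x n_ang grid of tiles (the spectral copies created
    by the mask), stack the tiles into a 4D spectrum, and take a 4D inverse FFT.
    """
    F = np.fft.fftshift(np.fft.fft2(photo))
    S, T = photo.shape[0] // n_ang, photo.shape[1] // n_ang
    tiles = F.reshape(n_ang, S, n_ang, T).transpose(0, 2, 1, 3)   # (n_ang, n_ang, S, T)
    spectrum_4d = np.fft.ifftshift(tiles)                          # back to FFT ordering
    return np.fft.ifftn(spectrum_4d)                               # complex 4D light field estimate

# Sizes quoted on the slide: an 1800x1800 photo yields a 9x9 x 200x200 light field.
photo = np.random.rand(1800, 1800)
print(heterodyne_light_field(photo).shape)   # (9, 9, 200, 200)
```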

71 Shear of Light Field: x'1 = x1 + θi·z. Diagram: free-space propagation by a distance z shears the light field l(x,θ) along x; the ray leaving x1 at angle θi arrives at x'1.

72 Light Propagation (Defocus Blur). Propagation shears the light field l(x,θ), so the captured photo is still a line integral of l(x,θ) and its 1-D FFT is still a central slice of the 2-D FFT L(fx,fθ), now along a sheared direction.

73 MERL Mask-Enhanced Cameras: Heterodyned Light Fields & Coded Aperture Veeraraghavan, Raskar, Agrawal, Mohan & Tumblin
Plenoptic Camera (microlens array on sensor): samples individual rays; predefined spectrum for lenses, chromatic aberration; high alignment precision; peripheral pixels wasted; negligible light loss.
Heterodyne Camera (mask on sensor): samples a coded combination of rays; supports any wavelength; reconfigurable f/#, easier alignment; no pixel wastage, high resolution image for parts of scene in focus; 50% light loss due to mask.

74 Space of LF representations. Diagram: time-frequency representations, phase-space representations, and quasi light fields; the Observable LF, Traditional light field, Augmented LF, WDF, and Rihaczek Distribution Function (plus other LF representations) arranged along an incoherent-to-coherent axis. Courtesy of Se Baek Oh. Used with permission.

75 Quasi light fields: the utility of light fields, the versatility of Maxwell. We form coherent images by formulating, capturing, and integrating quasi light fields. Diagram (as on the previous slide): Observable LF, Traditional light field, Augmented LF, WDF, Rihaczek Distribution Function, and other LF representations on an incoherent-to-coherent axis. Courtesy of Se Baek Oh. Used with permission.

76 (i) Observable Light Field: move an aperture across a plane and look at the directional spread; a continuous form of the plenoptic camera. Diagram: scene, aperture, position s, direction u. Courtesy of Se Baek Oh. Used with permission.

77 (ii) Augmented Light Field with LF Transformer. Diagram: the WDF and the Augmented LF connect to the traditional Light Field through a light field transformer; the LF propagates between (diffractive) optical elements, negative radiance is allowed, and interaction is modeled at the optical elements. Courtesy of Se Baek Oh. Used with permission.

78 Virtual light projector with real-valued (possibly negative) radiance along a ray. Diagram: two real projectors and a virtual light projector; first null (OPD = λ/2). Courtesy of Se Baek Oh. Used with permission.

79 (ii) ALF with LF Transformer Courtesy of Se Baek Oh. Used with permission.

80 Tradeoff between cross-interference terms and localization: (i) Spectrogram: non-negative, limited localization; (ii) Wigner: good localization, but cross terms; (iii) Rihaczek: good localization, but complex-valued. Diagram axes: u and y, 0 to 3 m. Courtesy of Se Baek Oh. Used with permission.

81 Property of the Representation (constant along rays | non-negativity | coherence | wavelength | interference cross term):
Traditional LF: always constant | always positive | only incoherent | zero | no
Observable LF: nearly constant | always positive | any coherence state | any | yes
Augmented LF: only in the paraxial region | positive and negative | any | any | yes
WDF: only in the paraxial region | positive and negative | any | any | yes
Rihaczek DF: no, linear drift | complex | any | any | reduced
Courtesy of Se Baek Oh. Used with permission.

82 Benefits & Limitations of the Representation (ability to propagate | modeling wave optics | simplicity of computation | adaptability to current pipeline | near field | far field):
Traditional LF: x-shear | no | very simple | high | no | yes
Observable LF: not x-shear | yes | modest | low | yes | yes
Augmented LF: x-shear | yes | modest | high | no | yes
WDF: x-shear | yes | modest | low | yes | yes
Rihaczek DF: x-shear | yes | better than WDF, not as simple as LF | low | no | yes
Courtesy of Se Baek Oh. Used with permission.

83 Motivation What is the difference between a hologram and a lenticular screen? How do they capture the phase of a wavefront for telescope applications? What is a wavefront coding lens for extended depth of field imaging?

84 Acknowledgements Dartmouth: Marcus Testorf. MIT: Ankit Mohan, Ahmed Kirmani, Jaewon Kim, George Barbastathis. Stanford: Marc Levoy, Ren Ng, Andrew Adams. Adobe: Todor Georgiev. MERL: Ashok Veeraraghavan, Amit Agrawal.

85 MIT Media Lab Light Fields Camera Culture Ramesh Raskar MIT Media Lab CameraCulture.info/

86 MIT OpenCourseWare http://ocw.mit.edu MAS.531 Computational Camera and Photography Fall 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
