A 3D Multi-Aperture Image Sensor Architecture


Keith Fife, Abbas El Gamal, and H.-S. Philip Wong
Department of Electrical Engineering, Stanford University

Outline
- Multi-aperture system overview
- Sensor architecture and operation
- Image extraction
- Calculation of depth and resolution
- Sensor and system parameters
- Circuit implementation

Multi-Aperture System
- Scene is focused via the objective lens above the detector plane
- Re-imaged via local optics onto disjoint arrays
- Arrays have overlapping fields of view
- Image is formed using digital signal processing
(Figure: objective lens, focal plane, multiple apertures, array of small focal planes)

Why Multi-Aperture Imaging
- Capture depth information
- Reduce requirements on the objective lens (cheaper optics)
- Achieve better color separation (less crosstalk)
- Redundant data allows for manufacturing defect correction
- Facilitate new circuit design architectures
- Benefit from pixel scaling

Architecture
- The sensor contains an m × n array of pixel groups
(Figure: block diagram with pixel groups G0, G1, ..., Gi and their local optics L0, L1, ..., Li; a sequencer driving the n rows; per-column ADCs; a row buffer across the m columns; digital output Dout)

Traditional vs. Multi-Aperture
(Figure: the traditional optical configuration compared with the multi-aperture optical configuration)

Local Optics
- The local optics and color filter array (CFA) can be built with a CMOS image sensor (CIS) process

Multi-Aperture Color System
- Spectral separation by aperture
- No color contamination from neighboring pixels
- Facilitates the use of a large dielectric stack height, which allows high logic density
(Figure: objective lens, focal plane, multiple apertures, array of small focal planes)

Projected Color Channels
- Color channels overlap only in the space above the detector

2D and 3D Image Extraction
- Depth information is obtained from the disparity between apertures.
- Object movement translates into lateral displacement between corresponding points imaged by the disjoint arrays.
- Solving the correspondence problem is eased by the use of several local apertures (a matching sketch follows this slide).
- The 2D image is formed by solving for the local correspondence and integrating the result across the sensor.
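
To make the correspondence step concrete, here is a minimal block-matching sketch that estimates integer disparity between two neighboring aperture images with a sum-of-absolute-differences cost. It is an illustrative assumption, not the signal-processing pipeline of this sensor; the function name, patch size, and search range are hypothetical.

    import numpy as np

    def block_disparity(ref, tgt, patch=7, max_disp=8):
        # Per-pixel horizontal disparity (in pixels) between two aperture
        # images, found by minimizing a sum-of-absolute-differences cost.
        ref = np.asarray(ref, dtype=np.float32)
        tgt = np.asarray(tgt, dtype=np.float32)
        h, w = ref.shape
        r = patch // 2
        disp = np.zeros((h, w), dtype=np.int32)
        for y in range(r, h - r):
            for x in range(r, w - r):
                block = ref[y - r:y + r + 1, x - r:x + r + 1]
                best_cost, best_d = np.inf, 0
                for d in range(min(max_disp, x - r) + 1):
                    cand = tgt[y - r:y + r + 1, x - d - r:x - d + r + 1]
                    cost = np.abs(block - cand).sum()
                    if cost < best_cost:
                        best_cost, best_d = cost, d
                disp[y, x] = best_d
        return disp

In the real system the small, known aperture spacing bounds the expected disparity, which is part of why the correspondence problem is eased; the sketch ignores that and simply scans a fixed range.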

Virtual Aperture Views
(Figure: chief rays for a pair of apertures; left and right virtual objective apertures; virtual apertures for a stereo view)

Depth Calculations
By the geometry of the local optics and the focal plane,

    C/L = D_0/Δ.

Using the lens law for A as a function of B, and making the substitution B = E - C = B_0 + C_0 - C,

    A = (1/f - 1/B)^(-1) = (1/f - 1/(B_0 + C_0 - C))^(-1).

Solving for A in terms of Δ, with M = B/A and N = D/C, gives the depth equation (evaluated in the sketch that follows):

    A = [1/f - 1/((M_0 + 1)f + D_0/N_0 - D_0 L/Δ)]^(-1).

(Figure: chief-ray geometry for a pair of apertures separated by L, with local image distance D.)
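
As a numerical companion to the depth equation, the sketch below evaluates A from a measured displacement Δ. The function name is an assumption for illustration; all lengths must simply share one unit.

    def depth_from_displacement(delta, f, M0, N0, D0, L):
        # Depth equation from the slide above:
        #   A = [1/f - 1/((M0 + 1)*f + D0/N0 - D0*L/delta)]^(-1)
        # delta: measured displacement, f: objective focal length,
        # M0, N0: nominal objective and local magnifications,
        # D0: nominal local image distance, L: aperture separation.
        B = (M0 + 1.0) * f + D0 / N0 - D0 * L / delta
        return 1.0 / (1.0 / f - 1.0 / B)

A larger measured Δ corresponds to a nearer object (smaller A), consistent with the disparity falling off with distance on the next slide.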

Depth Resolution Decreases with Distance
The amount of depth information available falls off with the square of the object distance. Solving for the measured displacement gives

    Δ = D_0 L / ((M_0 - M)f + D_0/N_0).

As M decreases, Δ rapidly approaches its limit of D_0 L / (M_0 f + D_0/N_0). The rate of change of Δ with A is

    ∂Δ/∂A ≈ -(f^2/A^2)(D L/C^2) ≈ -M^2 N^2 L/D.
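
A small numeric sketch of this falloff under assumed, purely illustrative parameter values: it evaluates Δ over a range of object distances, its far-field limit, and a finite-difference sensitivity that shrinks roughly with the square of the distance.

    # Hypothetical parameters (all lengths in mm), chosen only to illustrate scale.
    f, M0, N0, D0, L = 5.0, 0.005, 0.25, 0.05, 0.5

    def displacement(A):
        # Δ = D0*L / ((M0 - M)*f + D0/N0), with M = f/(A - f) from the lens law.
        M = f / (A - f)
        return D0 * L / ((M0 - M) * f + D0 / N0)

    far_limit = D0 * L / (M0 * f + D0 / N0)        # value Δ approaches as M -> 0
    for A in (1000.0, 2000.0, 4000.0):             # object distances in mm
        sens = displacement(A) - displacement(A + 1.0)
        print(A, displacement(A), far_limit, sens)  # sensitivity falls roughly as 1/A^2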

Spatial Resolution and Pixel Size
- Spatial resolution is limited by the total number of pixels, m·n·k².
- In order to achieve redundancy, the local magnification factor is set to N < 1.
- Spatial resolution is therefore reduced by a factor of 1/N².
- The total recoverable resolution is m·n·k²·N².
- Example: a 16 × 16 pixel array per aperture with 0.5 µm pixels and a magnification factor of N_0 = 1/4 produces a maximum resolution 16 times greater than the aperture count and 16 times lower than the pixel count (checked numerically below).
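
A quick arithmetic check of the example above; the aperture-array dimensions m and n are arbitrary assumptions, and the two ratios do not depend on them.

    m, n = 64, 48            # aperture (pixel-group) counts: arbitrary assumption
    k, N0 = 16, 0.25         # 16 x 16 pixels per aperture, local magnification 1/4
    apertures   = m * n
    pixels      = m * n * k**2
    recoverable = pixels * N0**2          # m*n*k^2*N^2
    print(recoverable / apertures)        # 16.0 -> 16x the aperture count
    print(pixels / recoverable)           # 16.0 -> 16x below the pixel count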

Spot Size Comparison
- The minimum spot size for a diffraction-limited system is approximately λ/NA.
- Using the Rayleigh criterion, the minimum useful pixel pitch is half the spot size, λ/2NA.
- Disparity from a multi-aperture system gives a displacement that can be smaller than the diffraction limit.
(Figure: spot size λ/NA and pixel pitch λ/2NA, with displacement Δ < λ/2NA)
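
For a sense of scale, a two-line evaluation of these expressions with assumed values (green light, NA = 0.7; both are illustrative choices, not figures from the presentation):

    lam, NA = 0.55e-6, 0.7        # wavelength (m) and numerical aperture: assumed
    spot  = lam / NA              # ~0.79 um minimum diffraction-limited spot
    pitch = lam / (2 * NA)        # ~0.39 um minimum useful pixel pitch (Rayleigh)
    print(spot, pitch)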

Pixel Structure
- Single-aperture array with local readout
- The architecture enables global exposure
(Figure: pixel-group schematic; labels include VSHIFT, TX, RT, CCD, BUFFER, RS, HSHIFT, CB, CT)

Capture and Readout Sequence
(Figure: frame timing showing reset, integration (T_int), transfer, readout (T_out), and vblank phases within the frame period T_frame; row timing showing V, H, RT, TX, and RS with repeated S1/S2 sampling)

Conclusion
- The depth map is extracted by solving the correspondence problem between multiple views of the same points in the primary focal plane.
- The spatial resolution of the system is shown to be greater than the aperture count itself and is governed by the magnification of the local optics and the pixel size.
- The amount of depth resolution available increases with decreasing pixel size, while the 2D spatial resolution remains limited.
- The sensor architecture may be useful in improving the performance of color imaging by employing a per-aperture color filter.