The End of Big Optics in Photography


Introduction

By M.S. Whalen, Applied Color Science, Inc., Aug. 2015

In the middle of the last century, observational astronomy had hit a wall. Larger telescopes were needed to see farther out into the universe, but massive glass mirrors like the one used in the 200-inch Hale telescope had become technically and financially impractical to build and maintain. The solution to this problem was the multiple mirror telescope, or MMT: the primary reflecting mirror was split into an array of smaller mirrors, and a computer system controlled the shape and position of each mirror element. The 400-inch, 36-segment Keck telescopes in Hawaii are testaments to the success of this approach.

Today, digital photography is quickly approaching a similar wall. Rapid advances in digital image sensor technology, and changes in consumer photography habits over the last several years, have begun to outstrip the capabilities of conventional optics. Consider these facts:

1.) Camera optics are lagging behind electronics. Image sensor manufacturers can now produce pixel arrays with an interpixel spacing (or pixel pitch) of 1 µm. A true Super 35 (4-perf) image sensor built at this pitch would be a pixel array of 24,890 (H) × 18,660 (V), or approximately 500 Mpixels! The spatial resolving power of such a sensor, expressed as a 50% MTF value, could far exceed 200 lp/mm. By comparison, most ARRI Prime lenses used in cinematography quote MTF values at 10 lp/mm.
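The resolution figures in point 1 can be checked with a few lines of arithmetic. Here is a minimal sketch, assuming a Super 35 (4-perf) image area of 24.89 mm × 18.66 mm and a 1 µm pixel pitch:

```python
# Back-of-the-envelope check of the sensor figures in the text.
# Assumed: Super 35 (4-perf) image area of 24.89 mm x 18.66 mm.

SUPER35_W_MM = 24.89
SUPER35_H_MM = 18.66
PIXEL_PITCH_UM = 1.0

pitch_mm = PIXEL_PITCH_UM / 1000.0
h_pixels = round(SUPER35_W_MM / pitch_mm)    # 24890
v_pixels = round(SUPER35_H_MM / pitch_mm)    # 18660
megapixels = h_pixels * v_pixels / 1e6       # ~464, i.e. roughly 500 Mpixels

# Nyquist limit: one line pair needs at least two pixels, so the
# sampling-limited resolution is 1/(2 * pitch) line pairs per mm.
nyquist_lp_mm = 1000.0 / (2.0 * PIXEL_PITCH_UM)   # 500 lp/mm

print(h_pixels, v_pixels, round(megapixels), nyquist_lp_mm)
```

Even allowing for the MTF roll-off well below the Nyquist limit, a 500 lp/mm sampling ceiling leaves the 200 lp/mm claim comfortably plausible.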

2.) Big optics are expensive. Purchasing high-end professional camera lenses requires a major financial outlay. Here are some examples from the B&H catalog:

Fujinon 14.5-45mm T2.0 Premier PL Zoom Lens: $99,800
Canon CINE-Servo 50-1000mm T5.0-8.9 (PL mount): $70,200

By comparison, an ARRI ALEXA camera body, widely considered the current gold standard in digital cinematography, retails for around $45,000, less than half the cost of the Fujinon lens.

3.) Big lenses are heavy! Both the Fujinon and the Canon lenses listed above weigh in at over 14 pounds. These are not the kind of lenses you throw in a backpack and tote around all day.

The Multiple Mirror Telescope (MMT) Solution

Maybe it's time to re-think how we capture photographic images. Instead of trying to make bigger lenses with higher spatial resolving power, why not make groups of smaller lenses and sensors work together to form high-resolution aggregate images? This approach has several compelling advantages:

1.) Image sensor manufacturers prefer to produce smaller image sensors. The product category driving most image sensor manufacturers today is the smartphone/mobile device, whose production volumes far outstrip any other application. These devices tend to use image sensors with an optical form factor of ½ inch or less. The production yields for these smaller sensors are also much higher than for large-area sensors. Why? Simple geometry: the odds of getting an image sensor with no defective pixels are much higher for a small pixel array than for a large one. So a silicon wafer fabricated with 2,000 small sensors on it may yield 1,600 perfect sensors, while the same size wafer with 20 large sensors may yield only 10.

2.) Image sensors are getting smarter. In the early days of digital photography (1990-2000), image sensors (CCDs and CMOS) were no more than arrays of photodiodes connected to electronics designed solely to transfer the photodiode signal to external image processing electronics. Now it is not uncommon for image sensors to incorporate complete image processing pipelines, so that the output of the image sensor is not just raw pixels but de-Bayered, color-corrected, formatted images.
Given the additional capabilities made possible by Moore's Law in electronics, it is not unreasonable to consider groups of image sensors that communicate with each other during the image capture process to optimize parameters like spatial resolution, color fidelity or dynamic range.
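The yield geometry in point 1 above follows a standard defect-density argument. A minimal sketch, assuming Poisson-distributed point defects and purely illustrative numbers (the defect density and die areas below are assumptions, not fitted to the rough wafer counts in the text):

```python
import math

# Poisson yield model: the probability that a die of area A contains
# zero randomly scattered point defects is exp(-D * A), where D is the
# defect density. D and the die areas are assumed, illustrative values.

DEFECT_DENSITY_PER_CM2 = 0.5   # assumed defects per cm^2

def yield_fraction(die_area_cm2: float) -> float:
    """Expected fraction of defect-free dies for a given die area."""
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_cm2)

small_die = 0.25   # cm^2, roughly a half-inch-class mobile sensor
large_die = 8.0    # cm^2, roughly a cinema-format sensor

print(f"small-sensor yield: {yield_fraction(small_die):.0%}")  # 88%
print(f"large-sensor yield: {yield_fraction(large_die):.0%}")  # 2%
```

The exponential dependence on die area is the point: shrinking the die does not merely improve yield proportionally, it improves it exponentially, which is why small sensors dominate high-volume production.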

3.) Small lenses are getting better. When cameras began to be integrated into smartphones and mobile devices, their lenses tended to be simple, low-resolution designs with low-cost plastic optical components. As the image sensors for mobile devices have improved, and competition among lens manufacturers has increased, the image quality from these small lenses has increased dramatically. A number of Chinese lens manufacturers like Evetar (www.leadingoptics.com) and Genius (www.gseo.com) now produce S-mount lenses with characteristics rivaling those of DSLR lenses. In fact, the GoPro Hero cameras (www.gopro.com), arguably the most popular action cameras ever built and a workhorse of cinema and broadcasting, use these types of lenses.

Some Notable Examples

There are already efforts underway to use multiple-lens / multiple-sensor designs to achieve the next level in digital imaging performance.

LIGHT

A Silicon Valley startup, Light (www.light.co), "aims to put a bunch of small lenses, each paired with its own image sensor, into smartphones and other gadgets. They'll fire simultaneously when you take a photo, and software will automatically combine the images. This way, Light believes, it can fit the quality and zoom of a bulky, expensive DSLR camera into much smaller, cheaper packages, even phones." (MIT Technology Review, 4/17/2015)

AWARE

On a more ambitious scale, a DARPA-funded project at Duke University, dubbed AWARE (www.disp.duke.edu/projects/aware), focuses on the design and manufacturing of microcameras as a platform for scalable supercameras. Their first prototype camera produced a staggering 1-gigapixel image from 98 micro-optic elements and image sensors covering a 120° by 40° field of view. The optical design of this system is something of a hybrid: it uses a single monocentric objective lens in conjunction with an array of identical secondary lenses along the focal surface of the objective. (See diagram below.)

Challenges and Prospects

If the success of the segmented-optic approach in astronomy is any indication, the future for this technology looks bright. However, the technical challenges facing this approach are not trivial. Some of the issues that need to be addressed before segmented-optics cameras can succeed in the mainstream are listed below:

1.) Good image quality from multiple lenses depends on how well the individual lenses are matched to each other. This was a lesson learned by 3D stereo camera developers. The process of producing optical elements (glass or plastic), coating them with the appropriate filters and assembling them into useful imaging lenses can lead to significant variations in lens performance characteristics like distortion, MTF and color rendition. Developers of multiple-lens camera systems will either need tight control over the lenses used in these systems or back-end processing to correct for differences among lenses.

2.) Segmented lens/sensor cameras require tight opto-mechanical tolerances. Possibly the most critical design task for these types of cameras is maintaining good, consistent alignment not just between each lens element and its associated image sensor, but also among groups of lens/image-sensor elements. With image sensor pixel spacing approaching 1 micron, a mechanical tolerance of +/- 0.1 mm means an uncertainty of 100 or more pixels between images. While this uncertainty can be compensated to some extent by back-end processing, the best approach is to reduce it on the front end with good opto-mechanical design.

3.) These cameras require massive image processing. The ability to stitch and blend multiple images into a seamless whole has been around since the days of space probes in the 1970s. However, it was not uncommon back then to take several hours (or days) to create a final image.
To make a successful commercial product, all of these operations must now happen within seconds (or possibly milliseconds). It is no accident that companies like Light are looking for software and algorithm developers rather than optical engineers. It has been said that we are entering the age of Computational Photography, where the image processing applied to a raw captured image is now more important than the raw image itself.

If these obstacles can be overcome, and the MMT analogy gives us reason to believe they will be, we can look forward to a quantum leap in photographic technology, one that may signal the end of Big Optics as a tool for image capture.
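The alignment and stitching problems in challenges 2 and 3 can be sketched together: first convert the mechanical tolerance into a pixel misregistration, then recover a purely translational offset between two overlapping tiles with FFT phase correlation. This is a minimal illustration (NumPy assumed); real multi-camera registration must also handle rotation, scale and lens distortion:

```python
import numpy as np

# Challenge 2: a +/- 0.1 mm mechanical tolerance at a 1 um pixel pitch
# corresponds to an uncertainty of 100 pixels between tiles.
tolerance_mm, pitch_um = 0.1, 1.0
misregistration_px = tolerance_mm * 1000.0 / pitch_um   # 100 pixels

# Challenge 3 (sketch): estimate a pure translation between two
# overlapping tiles via phase correlation on the cross-power spectrum.
def phase_correlate(a: np.ndarray, b: np.ndarray) -> tuple:
    """Estimate the integer (row, col) shift d with a == np.roll(b, d)."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12))   # normalized correlation
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Peaks past the array midpoint wrap around to negative shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
tile = rng.random((128, 128))
shifted = np.roll(tile, (7, -12), axis=(0, 1))   # simulated misalignment
print(misregistration_px, phase_correlate(shifted, tile))  # 100.0 (7, -12)
```

Doing this, plus blending and color matching, for dozens of tiles at video rates is exactly the computational load that pushes these cameras toward algorithm-heavy designs.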