Joint digital-optical design of imaging systems for grayscale objects


M. Dirk Robinson and David G. Stork
Ricoh Innovations, 2882 Sand Hill Rd, Suite 115, Menlo Park, CA

ABSTRACT

In many imaging applications, the objects of interest have a broad range of strongly correlated spectral components. For example, the spectral components of grayscale objects such as media printed with black ink or toner are nearly perfectly correlated spatially. We describe how to exploit such correlation during the design of electro-optical imaging systems to achieve greater imaging performance and lower optical component cost. These advantages are achieved by jointly optimizing the optical, detector, and digital image processing subsystems using a unified statistical imaging performance measure. The resulting optical systems have lower F# and greater depth-of-field than systems that do not exploit spectral correlations.

Keywords: spectral correlation, spectral coding, extended depth-of-field, image processing, optimization, digital imaging, end-to-end design

1. INTRODUCTION

As shown in recent work, 1 optimizing digital imaging systems by simultaneously designing both the optical and digital subsystems provides significant advantages over traditional sequential imaging system design methods. Specifically, simple digital filters can restore spatial contrast using spatial correlation information about source objects. 1 Such a joint design approach is based on analyzing the entire optical imaging system as a linear system. In this approach, the imaging system is modelled as

y = H(Φ)x + n, (1)

where H represents the system's point spread function, Φ the collection of optical design parameters (lens thicknesses, curvatures, glass types, etc.), x the ideally captured digital image, and n the noise inherent to photodetection.
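The linear model of Eq. 1 is easy to exercise numerically. The sketch below is an illustrative stand-in (not the authors' simulation code): it forms y = H(Φ)x + n for a hypothetical Gaussian PSF using circular convolution and additive Gaussian noise.

```python
import numpy as np

def gaussian_psf(size=9, sigma=1.5):
    """Rotationally symmetric Gaussian PSF, normalized to unit sum
    (an illustrative stand-in for the system PSF H(Phi))."""
    r = np.arange(size) - size // 2
    g = np.exp(-0.5 * (r / sigma) ** 2)
    psf = np.outer(g, g)
    return psf / psf.sum()

def capture(x, psf, noise_sigma, rng):
    """Simulate y = H(Phi) x + n with circular convolution and additive Gaussian noise."""
    otf = np.fft.fft2(psf, s=x.shape)            # zero-pad the PSF, take its OTF
    blurred = np.real(np.fft.ifft2(np.fft.fft2(x) * otf))
    return blurred + rng.normal(0.0, noise_sigma, x.shape)

rng = np.random.default_rng(0)
x = np.zeros((32, 32)); x[16, 16] = 1.0          # a point source reveals the PSF directly
y = capture(x, gaussian_psf(), noise_sigma=0.01, rng=rng)
```

Applying the model to a point source is a convenient sanity check: up to noise, the captured image is just a shifted copy of the PSF.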
End-to-end or joint optimization of the optical and digital subsystems is achieved by minimizing the predicted mean-square-error (MSE), defined by

MSE = E{ ||Ry − x||² }, (2)

where E represents the statistical expectation operation and R represents the digital filtering subsystem. The expectation considers the correlation of the random noise as well as the spatial correlation of the object. Under this framework, both the optical design parameters Φ and the digital filtering subsystem R are varied to find the MSE-optimal imaging system. Basic monochromatic imaging systems designed in this way achieve better contrast at improved signal-to-noise ratios (SNR) while relaxing the optical requirements in terms of aberrations. 1

(Send correspondence to: {dirkr,stork}@rii.ricoh.com)

We extend this concept to specialized imaging systems in which the objects of interest possess strong spectral correlations. For example, barcode images and many paper documents are typically printed
in black and white. The spectral reflectance of these objects is nearly perfectly correlated at every spatial location. In other words, the radiance distribution of the objects is very similar across a range of wavelengths. In the traditional design approach, a system designer would typically utilize a photodetector having a single spectral filter applied to all pixels uniformly, since the goals of the imaging system do not include extracting color information. For instance, the imaging system designer might choose a single-filter or monochromatic CMOS or CCD detector array and apply only an infrared (IR) filter to capture a range of wavelengths. In this paper, we explore an alternate approach which applies different color filters to different pixels to segment the spectrum even though the final image is to be grayscale. We call our approach spectrally-coded grayscale imaging. We introduce a new end-to-end design methodology that considers this spectral correlation information during the design of both the optics and the image processing subsystems. Our approach relaxes the requirements on the optical aberrations and enhances imaging capabilities, such as extending the depth-of-field. First, in Sect. 2 we describe how specialized image processing utilizes the spectral correlation found in grayscale objects to extract information across multiple color channels. Second, in Sect. 3 we describe how to jointly optimize both the optical and digital subsystems to improve imaging performance and enable new capabilities such as extended depth-of-field imaging. We also illustrate this method for a simple three-lens imaging system. We conclude with some speculations on further directions of this joint design.
2. SPECTRALLY-CODED GRAYSCALE IMAGING

Spectrally-coded grayscale imaging is a method of encoding image information in different parts of the spectrum and then decoding this information by a combination of spectral segmentation using optical filters and digital processing. Suppose that we begin with an ideal two-dimensional grayscale image represented by the vector x_s. We assume that the object has the same spatial intensity distribution throughout a range of wavelengths (e.g., the visible spectrum). Furthermore, we assume that the detector has a set of color filters to segment this spectrum. In this paper, we explore the common tri-color filter which segments the visible spectrum into Red, Green, and Blue channel images, but the extension to other filters and a greater number of filters is straightforward. A similar concept has been applied in confocal microscopy to simultaneously image at multiple object depths. 2 The captured image for each color channel is related to the unknown ideal grayscale image according to

y_c = H_c(Φ)x_s + n_c,  c ∈ {R, G, B}, (3)

where Φ again represents the collection of optical design parameters associated with the imaging system, such as the lens curvatures, glass types, or element spacings. The H_c(Φ) term represents the sampled point-spread function for the cth color channel image. The term n_c represents the additive noise having standard deviation σ. For the time being, we assume that the noise power is uniform over the different color channel images. The goal of the image processing is to estimate the unknown high-resolution image x_s from the three noisy and blurry color images, represented by {y_c}. The MSE-optimal estimate of the ideal grayscale image x_s is given by

x̂_s = ( Σ_c H_cᵀ(Φ)H_c(Φ) + σ²C_x⁻¹ )⁻¹ Σ_c H_cᵀ(Φ)y_c, (4)

where C_x represents the correlation matrix which captures the prior information about the spatial correlation, or smoothness, of the unknown signal.
This form is a variant of the single-color Wiener filter model described in our previous work on digital-optical optimization. 1 Equation 4 shows that the estimate of the grayscale image is a weighted average of the sharpened color channel images, and thus this estimation strategy requires
only simple linear digital filters. Such filtering requires minimal computational resources. Estimating the grayscale image using the linear filtering of Eq. 4 produces images having an MSE given by

MSE(Φ) = Tr[ ( (1/σ²) Σ_c H_cᵀ(Φ)H_c(Φ) + C_x⁻¹ )⁻¹ ]. (5)

In the traditional approach, a grayscale image is captured by means of a single color channel. To acquire a high-quality image, the optical system must either use a very narrow-band spectral filter and sacrifice SNR due to lost photons, or ensure that color aberrations are minimized. The latter approach is common when the system has limited control over the radiance of the object. In such a case, the captured image is equivalent to Eq. 3 while having only a single channel, and minimizing axial chromatic aberration, or axial color, becomes very important. Axial chromatic aberration describes the inability of the optical system to bring different wavelengths of light to focus at a single focal plane. 3 For example, while the red wavelength image is well focussed at the focal plane, the blue wavelength image is out of focus due to the dispersive nature of refractive lenses. When imaging broadband sources, the standard optical design method attempts to bring the collection of wavelengths to a single focus, thereby ensuring high contrast images across the visible spectral range of the object. Minimizing axial color ensures that the effective point spread function, the point spread function after integrating over the range of wavelengths, yields sharp images. For such a case, the eigenvalues of the system matrix H_s, locally the modulation transfer function (MTF) values, are large and so preserve image contrast throughout the range of spatial frequencies. The standard practice to minimize axial chromatic aberrations involves choosing lens materials with suitable dispersions to balance the aberrations.
For example, in a triplet lens system, the first and third lens elements (positively powered elements) are made of crown glasses (high Abbe numbers) while the second, negative lens element is made of flint glass (low Abbe number). In this way, the opposing chromatic aberrations are balanced. When employing multiple color channels, the optical system for a grayscale image need not provide high quality images across the entire range of wavelengths; rather, the collection of color channels must provide all the information needed to reconstruct the grayscale image. For example, if one color channel provides strong tangential contrast but weak sagittal contrast and another color channel provides strong sagittal contrast but weak tangential contrast, the combination of the two color channels can provide all the information required to estimate the grayscale image accurately. This approach thus relaxes the traditionally expensive constraints on optical subsystem performance and enables new classes of imaging systems. Combining the information using Eq. 4 requires knowledge of the system's point-spread function for each color channel. In applications where the object distance d is not fixed, however, this depth information is difficult to obtain. In such a case, the imaging system's PSF matrix is a function of the unknown object distance d and is expressed as H_c(Φ, d). Combining the information across the multiple color channels requires knowledge of the object depth d. Estimating object depth from a single image is a notoriously difficult problem to solve. 4 What makes the problem difficult is the typical lack of a contrast signature revealing the object's depth. The effect of depth-dependent defocus manifests as blurry or soft images, but attributing image softness to either defocus or merely a spatially-smooth object radiance map is difficult.
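When the channel PSFs H_c are known, the combination of Eq. 4 simplifies considerably under a circulant (shift-invariant, periodic-boundary) blur model, since every H_c is diagonalized by the 2-D FFT. The sketch below is a minimal per-frequency version of Eqs. 4 and 5 under that assumption; the helper names and the assumption that the signal power spectrum S of C_x is known are ours.

```python
import numpy as np

def combine_channels(y, otfs, sigma2, S):
    """Eq. 4 in the Fourier domain: per-frequency multi-channel Wiener combination.
    y: (C, N, N) noisy channel images; otfs: (C, N, N) channel OTFs;
    sigma2: noise variance; S: (N, N) signal power spectrum (spectrum of C_x)."""
    Y = np.fft.fft2(y, axes=(-2, -1))
    num = np.sum(np.conj(otfs) * Y, axis=0)               # sum_c H_c^* Y_c
    den = np.sum(np.abs(otfs) ** 2, axis=0) + sigma2 / S  # sum_c |H_c|^2 + sigma^2 / S
    return np.real(np.fft.ifft2(num / den))

def predicted_mse(otfs, sigma2, S):
    """Eq. 5 in the same diagonal form: mean per-frequency error variance."""
    den = np.sum(np.abs(otfs) ** 2, axis=0) / sigma2 + 1.0 / S
    return np.mean(1.0 / den)
```

With two identical, unblurred channels (OTF ≡ 1) and vanishing noise, combine_channels returns the original image and predicted_mse tends to zero, reflecting that additional channels can only add information.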
One standard approach to estimating depth involves acquiring multiple images focused at different depth planes, after which the depth can be estimated. 4 The multiple images provide the necessary information to distinguish the image signal from the unknown depth-dependent defocus. Analogous to these depth-from-defocus approaches, 4 we propose optimizing the wavelength-dependent point-spread functions across multiple color channels to encode the object depth. In this way, we infer the object's depth by analyzing its associated axial color aberrations. The depth-dependent blur associated with
the multiple color channel images allows us to estimate the object depth. The maximum likelihood approach to estimating the object depth requires maximizing the function

J(d) = ( Σ_c H_cᵀ(Φ)y_c )ᵀ ( Σ_c H_cᵀ(Φ)H_c(Φ) + σ²C_x⁻¹ )⁻¹ ( Σ_c H_cᵀ(Φ)y_c ). (6)

The accuracy of these methods hinges on the variation of the system matrix H_c across the multiple color channels. We verify the depth-dependent variation of this sharpness measure with respect to the object depth in the next section. After estimating the object depth d, we obtain the system matrices H_c(Φ, d) and can estimate the grayscale image via Eq. 4. In the case of extended depth-of-field imaging, the out-of-focus color channel often provides little information about the grayscale image because the severe defocus eliminates the image signal. A simple approximation to Eq. 4 is to use only the sharpest color channel image. Similar to autofocus algorithms, we estimate image sharpness by filtering each color channel image with a high-pass spatial filter (e.g., the standard Laplacian filter 5 ) and computing the energy of the filtered images. A reasonable estimate of the object depth d is obtained by fitting the relative sharpness of the different color channel images to a model. We find that this approach can provide reasonable estimates of the grayscale image as long as the axial color aberration is not too severe. Enabling extended depth-of-field imaging requires that we provide good MSE performance over a range of object depths. The average MSE performance over the desired range of depths is computed by sampling the MSE at K different depth points and then averaging according to

P(Φ) = (1/K) Σ_i MSE(Φ, d_i), (7)

where d_i represents samples within the depth range indexed by i. We choose the set of depths to correspond to equal depth ranges in terms of diopters. This can be approximated by evenly sampling in the depth-of-focus space.
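The diopter-uniform depth sampling and the averaging of Eq. 7 are straightforward to express. The helper below is a sketch; the function names and the constant-MSE callable in the usage note are ours, for illustration only.

```python
import numpy as np

def diopter_sampled_depths(d_near_mm, K):
    """Sample K object depths uniformly in diopters (1/m), from infinity
    (0 diopters) down to the nearest working distance d_near_mm."""
    diopters = np.linspace(0.0, 1000.0 / d_near_mm, K)
    with np.errstate(divide="ignore"):
        return 1000.0 / diopters   # back to mm; the first entry is inf (object at infinity)

def average_mse(mse_fn, depths):
    """Eq. 7: P(Phi) = (1/K) sum_i MSE(Phi, d_i), with the depth-dependent
    MSE of Eq. 5 supplied as a callable mse_fn(d)."""
    return float(np.mean([mse_fn(d) for d in depths]))
```

For d_near_mm = 150 and K = 8, the samples run monotonically from infinity down to 150 mm; note that the actual depth list used in Sect. 3 was chosen around a 750 mm nominal distance rather than by this exact rule.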
For our current implementation, we assume that the signal's spatial correlation is fractal in nature, so C_x does not depend on the object distance. Other signals, such as bar codes, may have a correlation structure which changes with the object depth according to the effective magnification at different object distances.

3. GRAYSCALE TRIPLET IMAGING SYSTEM

In the previous sections, we explained how prior information about spectral correlation allows us to spectrally encode depth information to extend the imaging system's depth-of-field. In this section, we analyze a triplet lens design based on this spectral-coding design principle. Specifically, we compare the performance of a triplet imaging system designed using the traditional, single color filter architecture with one designed using the digital-optical design framework and spectral coding.

3.1. Triplet Specifications

The triplet specifications correspond roughly to those of a 40 degree field-of-view (FOV) VGA web-camera with a 1/5-inch sensor, as shown in Table 1. The triplet system comprises two glass spherical elements and a third plastic aspheric element which corrects field errors. The plastic element is defined by even-ordered aspheric surfaces up to the 8th-order rotationally-symmetric polynomial. The 20 optical design variables include the spherical lens curvatures, the aspheric terms, the lens and air thicknesses, and the glass or plastic types.

sensor size: 1/5"
resolution: VGA
pixel pitch: 4.5 µm
spectral range:
focal length: 4.75 mm
FOV: 40°
glass types: Glass, Glass, Plastic
max. chief ray angle: 16°
track length: 7 mm
max. F#: 3.0
max. distortion: 3.0 %

Table 1. Triplet system specifications

3.2. Traditional Triplet

First, we used traditional methods to optimize the triplet system, focussing three test wavelengths (8, 4, 2 nm) onto a single focal plane at a working distance of 750 mm. We achieved this by balancing glass types in order to minimize chromatic aberration. The merit function used to optimize the optical system was based on the RMS optical path difference (OPD) wavefront error. The upper left side of Fig. 1 shows this aberration-minimizing design. After global optimization, the design form followed a traditional positive-negative-positive triplet form in crown-flint-crown glass. After optimization using the Schott catalog, the glass types are N-FK51A and N-SF10. The plastic is a high-index, low-dispersion COC-type plastic, E48R. We find that the design provides acceptable performance at F# 3.0. Below this f-number, the lens begins to suffer from a loss in contrast over the range of wavelengths. The curves at the bottom of Fig. 1 show the field curvature for the three different RGB test wavelengths. The curves show that the optical system does a reasonable job of focussing all three color channels onto a single focal plane. To achieve this, however, the system suffers from a bit of astigmatism.

3.3. Spectral Coding Triplet

In the second design approach, we assume that the sensor uses a standard set of RGB color filters to segment the spectrum. We optimized the optical design using a merit function based on the average MSE over a range of seven depth locations according to Eq. 7. We use a simple spatial covariance model where the covariance between neighboring pixels is given by 0.9^k, where k is the spatial separation in pixels. 5 We assume the system's SNR is 40 dB.
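The two modeling assumptions in this section, the exponential spatial covariance and the 40 dB SNR, translate directly into the quantities used by Eqs. 4 and 5. A small sketch (the helper names are ours):

```python
import numpy as np

def exponential_covariance(n, rho=0.9):
    """One-dimensional covariance matrix with Cov(i, j) = rho**|i - j|,
    the 0.9**k neighboring-pixel model used for C_x."""
    k = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return rho ** k

def noise_sigma_from_snr(signal_power, snr_db=40.0):
    """Noise standard deviation implied by an SNR specification in dB."""
    return float(np.sqrt(signal_power / 10.0 ** (snr_db / 10.0)))
```

At 40 dB and unit signal power this gives a noise standard deviation of 0.01, i.e., noise amplitudes of about 1% of the signal.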
We achieve joint optimization of both the optical and digital processing subsystems by using the user-defined operand capability of Zemax, a commercially-available lens design software tool. We created a user-defined operand to compute the predicted MSE according to Eq. 7. In this fashion, we can leverage the optimization capabilities of the Zemax lens design software. 1 The depths were chosen to uniformly sample the depth-of-focus range for a nominal object distance of 750 mm. The depth locations used during optimization were infinity, 2000, 1000, 750, 380, 255, 190, and 150 mm. Again, we performed global optimization over the optical design parameters using the traditional triplet as the starting design. The resulting design is shown in the upper right side of Fig. 1. The triplet form again follows a positive-negative-positive design form. The glass types, however, are both high-index flints, N-SF6 and LASF32. The plastic is also a low Abbe number polycarbonate, which is much less expensive than the high Abbe number COC plastic used in the traditional design. The design achieves increased light gathering capacity (1.5X) over the traditional design. The curves in the bottom right of Fig. 1 show the field curvature for the three color wavelengths. The field curvature plots show the strong separation between the three wavelengths' focal planes due to strong axial color aberrations in the spectrally-coded system.

(Lens layouts: Traditional, F# 3.0; Spectral Coded. Scale: 6.4 mm.)

Figure 1. The lens on the left represents the traditional optical design approach, which focusses all three wavelengths at a single focal plane. The bottom left curve shows the field curvature plots for the three test wavelengths (RGB). The traditional optical system brings the three color planes into focus at nearly the same focal plane. The system does, however, suffer from a bit of astigmatism. The spectral-coding design on the right achieves increased light gathering capacity (1.5X). The field curvature plots show the strong separation between the three wavelengths' focal planes due to strong axial color aberrations.

3.4. Depth-of-field Comparison

We compare the effective depth-of-field performance of our traditional, single channel triplet and our spectrally-coded triplet. In grayscale imaging, the final image quality depends on the spectral sensitivity of the detectors. The top graphs of Fig. 2 show the spectral sensitivities for the single channel detector (left) and the spectrally-coded system (right). Both systems cover the same spectral range and reflect the typical sensitivities of commercially-available sensors combined with IR cutoff filters. The curve in the bottom left of Fig. 2 shows the through-focus polychromatic MTF for the traditional system using a set of nine wavelengths weighted by the spectral sensitivity of the single channel sensor. The through-focus polychromatic MTF shows the MTF at 50 lp/mm spatial frequency for the on-axis field point. As we would expect, the MTF falls off as we move away from the focal plane due to defocus. The system shows a maximum depth-of-focus of about 120 µm. While this design provides reasonable quality at the proper focal distance, the limited depth-of-field shows that the imaging system will provide defocussed images at a working distance of about 250 mm, which would require a focal shift of about 150 µm.
Unfortunately, a fixed-focus lens system designed under this constraint will work only within a particular depth range around the chosen object distance. Furthermore, the depth-of-field decreases with decreased F# (and hence increased light sensitivity), creating an undesirable tradeoff. The curves on the bottom right show the polychromatic MTF curves for the spectrally-coded system. The MTF again reflects a polychromatic average over different sets of nine spectrally-weighted wavelengths. As expected, the different color channels focus at different depth planes. The depth-of-focus for the spectrally-coded system is extended to about 240 µm. Over the focal range, however, at least one of the color channels
provides strong contrast. Furthermore, for every depth plane, at least one of the wavelengths has significantly poor contrast, suggesting the ability to infer object depth from color channel image sharpness. Also, the spectrally-coded triplet has an increased light gathering capacity of F# 2.4.

Figure 2. The top left curve shows the spectral sensitivity of the single-channel sensor used in the traditional imaging system. The bottom left curve shows the polychromatic MTF of the traditional triplet system using nine spectrally-weighted wavelengths for the on-axis field point. The system shows reasonable performance within about ±60 µm of the nominal focus position. The top right curve shows the spectral sensitivities for the three color channel spectrally-coded triplet system. The three curves on the bottom right show the polychromatic MTF using different spectrally-weighted wavelengths according to the color channel sensitivities for the on-axis field point. The three curves reveal the three different color focal planes. The thick line shows the effective MTF combining the best MTF among the three color channels. The system shows that at least one of the color channels is in focus within ±120 µm of the nominal focal distance.

3.5. Image Simulation Results

We simulated images produced by these two systems using our imaging system simulation tool. 1 Our image simulation tool is similar to that described previously, 6 with the extension of adding multi-spectral weighting according to the pixel spectral sensitivity. We use a traditional Air Force resolution target as our simulated object: a binary target having either uniform broadband radiance or none, simulating a perfectly correlated object.
When simulating the captured images, we use three spectral samples per color channel to simulate the spectral integration of the detector. Figure 3 compares portions of the target at two different object depths. We show a cropped portion of the image so as to reveal the resolution properties of the image. The leftmost image column shows the target at 1.5 meters (top) and 130 millimeters (bottom) for the traditional single channel imaging system. The image shows good contrast at 1.5 meters, but very poor contrast at 130 millimeters due to depth-of-field limitations. The images in the second and third columns show the same scene for the Red channel (middle) and Blue channel (right) at the two object depths. At 1.5 meters, the red image shows almost equivalent contrast to the single channel imaging system while the blue image shows very low contrast. Conversely, the red image shows very low contrast when the object is at 130 millimeters, while the blue image shows sharp contrast. These images visualize the contrast predicted by the through-focus MTF shown in Fig. 2.

Figure 3. The left column shows the images of a resolution target produced by the traditional single channel imaging system for an object located at 1500 mm (top) and 130 mm (bottom). The system shows good contrast at 1500 mm but very low contrast at the short working distance of 130 mm due to limited depth-of-field. The second and third columns show the red and blue channel images, respectively. The red image shows good contrast at 1500 mm while the blue image shows good contrast at 130 mm.

To evaluate the ability to discern object depth using the spectrally-coded imaging system, we simulated imaging the resolution target located at 2 m, 1 m, 750 mm, 380 mm, 250 mm, 190 mm, and 130 mm. We then applied a simple Laplacian sharpness filter to each of the color channel images and compared the relative magnitude of the filtered images for a small patch near the center of the image. To compute the relative magnitude, we first integrate the energy in the filtered images for a pixel patch at the center of the image for the three color channels. Then, we normalize the three color channel values so that the sum of the energies equals one. This approximates the percentage of the total high frequency image energy present in each of the three color channel images. Figure 4 compares the relative sharpness for the three color channel images as a function of object depth. The images in the left and right columns show the magnitude of the Laplacian filtered images for the object located at 130 mm and 1.5 m, respectively. The curves demonstrate the clear relationship between object distance and relative sharpness. The simplest application of the relative sharpness is to find the sharpest image over the collection of image planes to use as the captured image.
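The relative-sharpness computation described above is simple to reproduce. The sketch below is our own illustration of the procedure: it applies a 4-neighbor Laplacian (with circular boundaries via np.roll), integrates the filtered energy over a central patch of each channel, and normalizes the channel energies to sum to one.

```python
import numpy as np

def laplacian_energy(img):
    """Energy of the discrete (4-neighbor) Laplacian of an image,
    a standard high-pass focus/sharpness measure."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return np.sum(lap ** 2)

def relative_sharpness(channels, patch=16):
    """Fraction of the total high-frequency energy in each color channel,
    computed on a central patch and normalized to sum to one.
    Assumes at least one channel has nonzero high-frequency energy."""
    h, w = channels[0].shape
    r0, c0 = (h - patch) // 2, (w - patch) // 2
    e = np.array([laplacian_energy(ch[r0:r0 + patch, c0:c0 + patch])
                  for ch in channels])
    return e / e.sum()
```

A sharp (high-frequency) channel captures nearly all of the normalized energy, while a defocused or flat channel contributes almost none, which is exactly the depth cue plotted in Fig. 4.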
4. CONCLUSIONS AND FUTURE WORK

We presented a novel framework for analyzing and designing imaging systems for the class of grayscale objects whose different spectral bands have strong correlation. We demonstrated how such correlation information enables the system designer not only to relax strict requirements on optical aberrations, but also to enable new imaging capabilities such as extended depth-of-field imaging through spectral coding. We used our new design philosophy to design an extended depth-of-field triplet imaging system, verifying the increased depth-of-field through image simulation. Finally, we highlighted an additional advantage of this new approach: simple object depth estimation using filter-based sharpness measures. The cost of such image processing is low enough to make this approach attractive for grayscale imaging systems. The current work suggests numerous future research directions. In this report, we ignored the loss in
spatial resolution due to spatial multiplexing of the color filters. The most practical application of spectral coding will undoubtedly require such spatial multiplexing. Future research could address the processing required to restore resolution by combining the multiple color channel sub-images. In our work, we focussed on strongly correlated objects such as bar codes or grayscale documents. A multispectral analysis of general images could reveal spectral correlations, albeit weaker ones, in such images as well. Future work might address methods for leveraging this weak spectral correlation information when designing general purpose imaging systems to improve F# and increase depth-of-field.

Figure 4. The curve shows the relative sharpness at the center of each color channel image versus the distance of the object from the camera. The sharpness was computed as the average magnitude of the color channel image filtered by a Laplacian filter. The columns of images at the left and right visualize the magnitude of the Laplacian filtered images at 130 mm and 1.5 m, respectively. The curves demonstrate the clear relationship between object distance and the computed sharpness metric.

REFERENCES

1. D. G. Stork and M. D. Robinson, "Theoretical foundations for joint digital-optical analysis of electro-optical imaging systems," Applied Optics, April.
2. H. Tiziani and H. Uhde, "Three-dimensional image sensing by chromatic confocal microscopy," Applied Optics 33(10).
3. J. W. Goodman, Introduction to Fourier Optics, McGraw-Hill, New York, NY, second ed.
4. S. Chaudhuri and A. Rajagopalan, Depth from Defocus: A Real Aperture Imaging Approach, Springer-Verlag.
5. A. K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, Englewood Cliffs, New Jersey, 1st ed.
6. P. Maeda, P. B. Catrysse, and B. A. Wandell, "Integrating lens design with digital camera simulation," Proceedings of SPIE Electronic Imaging, San Jose, CA, 5678, February 2005.


Criteria for Optical Systems: Optical Path Difference How do we determine the quality of a lens system? Several criteria used in optical design Criteria for Optical Systems: Optical Path Difference How do we determine the quality of a lens system? Several criteria used in optical design Computer Aided Design Several CAD tools use Ray Tracing (see

More information

Confocal Imaging Through Scattering Media with a Volume Holographic Filter

Confocal Imaging Through Scattering Media with a Volume Holographic Filter Confocal Imaging Through Scattering Media with a Volume Holographic Filter Michal Balberg +, George Barbastathis*, Sergio Fantini % and David J. Brady University of Illinois at Urbana-Champaign, Urbana,

More information

Be aware that there is no universal notation for the various quantities.

Be aware that there is no universal notation for the various quantities. Fourier Optics v2.4 Ray tracing is limited in its ability to describe optics because it ignores the wave properties of light. Diffraction is needed to explain image spatial resolution and contrast and

More information

OCT Spectrometer Design Understanding roll-off to achieve the clearest images

OCT Spectrometer Design Understanding roll-off to achieve the clearest images OCT Spectrometer Design Understanding roll-off to achieve the clearest images Building a high-performance spectrometer for OCT imaging requires a deep understanding of the finer points of both OCT theory

More information

ADAPTIVE CORRECTION FOR ACOUSTIC IMAGING IN DIFFICULT MATERIALS

ADAPTIVE CORRECTION FOR ACOUSTIC IMAGING IN DIFFICULT MATERIALS ADAPTIVE CORRECTION FOR ACOUSTIC IMAGING IN DIFFICULT MATERIALS I. J. Collison, S. D. Sharples, M. Clark and M. G. Somekh Applied Optics, Electrical and Electronic Engineering, University of Nottingham,

More information

Optical Design with Zemax

Optical Design with Zemax Optical Design with Zemax Lecture : Correction II 3--9 Herbert Gross Summer term www.iap.uni-jena.de Correction II Preliminary time schedule 6.. Introduction Introduction, Zemax interface, menues, file

More information

Laboratory experiment aberrations

Laboratory experiment aberrations Laboratory experiment aberrations Obligatory laboratory experiment on course in Optical design, SK2330/SK3330, KTH. Date Name Pass Objective This laboratory experiment is intended to demonstrate the most

More information

3D light microscopy techniques

3D light microscopy techniques 3D light microscopy techniques The image of a point is a 3D feature In-focus image Out-of-focus image The image of a point is not a point Point Spread Function (PSF) 1D imaging 2D imaging 3D imaging Resolution

More information

Performance of extended depth of field systems and theoretical diffraction limit

Performance of extended depth of field systems and theoretical diffraction limit Performance of extended depth of field systems and theoretical diffraction limit Frédéric Guichard, Frédéric Cao, Imène Tarchouna, Nicolas Bachelard DxO Labs, 3 Rue Nationale, 92100 Boulogne, France ABSTRACT

More information

Tutorial Zemax 8: Correction II

Tutorial Zemax 8: Correction II Tutorial Zemax 8: Correction II 2012-10-11 8 Correction II 1 8.1 High-NA Collimator... 1 8.2 Zoom-System... 6 8.3 New Achromate and wide field system... 11 8 Correction II 8.1 High-NA Collimator An achromatic

More information

Exercise 1 - Lens bending

Exercise 1 - Lens bending Exercise 1 - Lens bending Most of the aberrations change with the bending of a lens. This is demonstrated in this exercise. a) Establish a lens with focal length f = 100 mm made of BK7 with thickness 5

More information

Performance Comparison of Spectrometers Featuring On-Axis and Off-Axis Grating Rotation

Performance Comparison of Spectrometers Featuring On-Axis and Off-Axis Grating Rotation Performance Comparison of Spectrometers Featuring On-Axis and Off-Axis Rotation By: Michael Case and Roy Grayzel, Acton Research Corporation Introduction The majority of modern spectrographs and scanning

More information

High resolution extended depth of field microscopy using wavefront coding

High resolution extended depth of field microscopy using wavefront coding High resolution extended depth of field microscopy using wavefront coding Matthew R. Arnison *, Peter Török #, Colin J. R. Sheppard *, W. T. Cathey +, Edward R. Dowski, Jr. +, Carol J. Cogswell *+ * Physical

More information

Lens Design II. Lecture 2: Structural modifications Herbert Gross. Winter term

Lens Design II. Lecture 2: Structural modifications Herbert Gross. Winter term Lens Design II Lecture 2: Structural modifications 26--26 Herbert Gross Winter term 26 www.iap.uni-jena.de 2 Preliminary Schedule 9.. Aberrations and optimization Repetition 2 26.. Structural modifications

More information

Some lens design methods. Dave Shafer David Shafer Optical Design Fairfield, CT #

Some lens design methods. Dave Shafer David Shafer Optical Design Fairfield, CT # Some lens design methods Dave Shafer David Shafer Optical Design Fairfield, CT 06824 #203-259-1431 shaferlens@sbcglobal.net Where do we find our ideas about how to do optical design? You probably won t

More information

The Importance of Wavelengths on Optical Designs

The Importance of Wavelengths on Optical Designs 1 The Importance of Wavelengths on Optical Designs Bad Kreuznach, Oct. 2017 2 Introduction A lens typically needs to be corrected for many different parameters as e.g. distortion, astigmatism, spherical

More information

Microscope anatomy, image formation and resolution

Microscope anatomy, image formation and resolution Microscope anatomy, image formation and resolution Ian Dobbie Buy this book for your lab: D.B. Murphy, "Fundamentals of light microscopy and electronic imaging", ISBN 0-471-25391-X Visit these websites:

More information

Waves & Oscillations

Waves & Oscillations Physics 42200 Waves & Oscillations Lecture 33 Geometric Optics Spring 2013 Semester Matthew Jones Aberrations We have continued to make approximations: Paraxial rays Spherical lenses Index of refraction

More information

DIMENSIONAL MEASUREMENT OF MICRO LENS ARRAY WITH 3D PROFILOMETRY

DIMENSIONAL MEASUREMENT OF MICRO LENS ARRAY WITH 3D PROFILOMETRY DIMENSIONAL MEASUREMENT OF MICRO LENS ARRAY WITH 3D PROFILOMETRY Prepared by Benjamin Mell 6 Morgan, Ste156, Irvine CA 92618 P: 949.461.9292 F: 949.461.9232 nanovea.com Today's standard for tomorrow's

More information

BIG PIXELS VS. SMALL PIXELS THE OPTICAL BOTTLENECK. Gregory Hollows Edmund Optics

BIG PIXELS VS. SMALL PIXELS THE OPTICAL BOTTLENECK. Gregory Hollows Edmund Optics BIG PIXELS VS. SMALL PIXELS THE OPTICAL BOTTLENECK Gregory Hollows Edmund Optics 1 IT ALL STARTS WITH THE SENSOR We have to begin with sensor technology to understand the road map Resolution will continue

More information

Aberrations and adaptive optics for biomedical microscopes

Aberrations and adaptive optics for biomedical microscopes Aberrations and adaptive optics for biomedical microscopes Martin Booth Department of Engineering Science And Centre for Neural Circuits and Behaviour University of Oxford Outline Rays, wave fronts and

More information

An Indian Journal FULL PAPER. Trade Science Inc. Parameters design of optical system in transmitive star simulator ABSTRACT KEYWORDS

An Indian Journal FULL PAPER. Trade Science Inc. Parameters design of optical system in transmitive star simulator ABSTRACT KEYWORDS [Type text] [Type text] [Type text] ISSN : 0974-7435 Volume 10 Issue 23 BioTechnology 2014 An Indian Journal FULL PAPER BTAIJ, 10(23), 2014 [14257-14264] Parameters design of optical system in transmitive

More information

Study on Imaging Quality of Water Ball Lens

Study on Imaging Quality of Water Ball Lens 2017 2nd International Conference on Mechatronics and Information Technology (ICMIT 2017) Study on Imaging Quality of Water Ball Lens Haiyan Yang1,a,*, Xiaopan Li 1,b, 1,c Hao Kong, 1,d Guangyang Xu and1,eyan

More information

Observational Astronomy

Observational Astronomy Observational Astronomy Instruments The telescope- instruments combination forms a tightly coupled system: Telescope = collecting photons and forming an image Instruments = registering and analyzing the

More information

Optical System Design

Optical System Design Phys 531 Lecture 12 14 October 2004 Optical System Design Last time: Surveyed examples of optical systems Today, discuss system design Lens design = course of its own (not taught by me!) Try to give some

More information

12.4 Alignment and Manufacturing Tolerances for Segmented Telescopes

12.4 Alignment and Manufacturing Tolerances for Segmented Telescopes 330 Chapter 12 12.4 Alignment and Manufacturing Tolerances for Segmented Telescopes Similar to the JWST, the next-generation large-aperture space telescope for optical and UV astronomy has a segmented

More information

Geometric optics & aberrations

Geometric optics & aberrations Geometric optics & aberrations Department of Astrophysical Sciences University AST 542 http://www.northerneye.co.uk/ Outline Introduction: Optics in astronomy Basics of geometric optics Paraxial approximation

More information

Sensors and Sensing Cameras and Camera Calibration

Sensors and Sensing Cameras and Camera Calibration Sensors and Sensing Cameras and Camera Calibration Todor Stoyanov Mobile Robotics and Olfaction Lab Center for Applied Autonomous Sensor Systems Örebro University, Sweden todor.stoyanov@oru.se 20.11.2014

More information

Point Spread Function. Confocal Laser Scanning Microscopy. Confocal Aperture. Optical aberrations. Alternative Scanning Microscopy

Point Spread Function. Confocal Laser Scanning Microscopy. Confocal Aperture. Optical aberrations. Alternative Scanning Microscopy Bi177 Lecture 5 Adding the Third Dimension Wide-field Imaging Point Spread Function Deconvolution Confocal Laser Scanning Microscopy Confocal Aperture Optical aberrations Alternative Scanning Microscopy

More information

WaveMaster IOL. Fast and Accurate Intraocular Lens Tester

WaveMaster IOL. Fast and Accurate Intraocular Lens Tester WaveMaster IOL Fast and Accurate Intraocular Lens Tester INTRAOCULAR LENS TESTER WaveMaster IOL Fast and accurate intraocular lens tester WaveMaster IOL is an instrument providing real time analysis of

More information

Some of the important topics needed to be addressed in a successful lens design project (R.R. Shannon: The Art and Science of Optical Design)

Some of the important topics needed to be addressed in a successful lens design project (R.R. Shannon: The Art and Science of Optical Design) Lens design Some of the important topics needed to be addressed in a successful lens design project (R.R. Shannon: The Art and Science of Optical Design) Focal length (f) Field angle or field size F/number

More information

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design Outline Chapter 1: Introduction Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design 1 Overview: Integration of optical systems Key steps

More information

Exercises Advanced Optical Design Part 5 Solutions

Exercises Advanced Optical Design Part 5 Solutions 2014-12-09 Manuel Tessmer M.Tessmer@uni-jena.dee Minyi Zhong minyi.zhong@uni-jena.de Herbert Gross herbert.gross@uni-jena.de Friedrich Schiller University Jena Institute of Applied Physics Albert-Einstein-Str.

More information

Reflectors vs. Refractors

Reflectors vs. Refractors 1 Telescope Types - Telescopes collect and concentrate light (which can then be magnified, dispersed as a spectrum, etc). - In the end it is the collecting area that counts. - There are two primary telescope

More information

Optical Zoom System Design for Compact Digital Camera Using Lens Modules

Optical Zoom System Design for Compact Digital Camera Using Lens Modules Journal of the Korean Physical Society, Vol. 50, No. 5, May 2007, pp. 1243 1251 Optical Zoom System Design for Compact Digital Camera Using Lens Modules Sung-Chan Park, Yong-Joo Jo, Byoung-Taek You and

More information

Lens Design I. Lecture 10: Optimization II Herbert Gross. Summer term

Lens Design I. Lecture 10: Optimization II Herbert Gross. Summer term Lens Design I Lecture : Optimization II 5-6- Herbert Gross Summer term 5 www.iap.uni-jena.de Preliminary Schedule 3.. Basics.. Properties of optical systrems I 3 7.5..5. Properties of optical systrems

More information

1.1 Singlet. Solution. a) Starting setup: The two radii and the image distance is chosen as variable.

1.1 Singlet. Solution. a) Starting setup: The two radii and the image distance is chosen as variable. 1 1.1 Singlet Optimize a single lens with the data λ = 546.07 nm, object in the distance 100 mm from the lens on axis only, focal length f = 45 mm and numerical aperture NA = 0.07 in the object space.

More information

Modulation Transfer Function

Modulation Transfer Function Modulation Transfer Function The Modulation Transfer Function (MTF) is a useful tool in system evaluation. t describes if, and how well, different spatial frequencies are transferred from object to image.

More information

7x P/N C1601. General Description

7x P/N C1601. General Description METRICZOOM SWIR 7x METRIC ZOOM-SWIR ZOOM 7x P/N C1601 C General Description This family of high resolution METRIC ZOOM SWIR lenses image from 0.9 to 2.3 µm making them especially well-suited well for surveillance,

More information

digital film technology Resolution Matters what's in a pattern white paper standing the test of time

digital film technology Resolution Matters what's in a pattern white paper standing the test of time digital film technology Resolution Matters what's in a pattern white paper standing the test of time standing the test of time An introduction >>> Film archives are of great historical importance as they

More information

October 7, Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA Dear Peter:

October 7, Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA Dear Peter: October 7, 1997 Peter Cheimets Smithsonian Astrophysical Observatory 60 Garden Street, MS 5 Cambridge, MA 02138 Dear Peter: This is the report on all of the HIREX analysis done to date, with corrections

More information

Lecture 4: Geometrical Optics 2. Optical Systems. Images and Pupils. Rays. Wavefronts. Aberrations. Outline

Lecture 4: Geometrical Optics 2. Optical Systems. Images and Pupils. Rays. Wavefronts. Aberrations. Outline Lecture 4: Geometrical Optics 2 Outline 1 Optical Systems 2 Images and Pupils 3 Rays 4 Wavefronts 5 Aberrations Christoph U. Keller, Leiden University, keller@strw.leidenuniv.nl Lecture 4: Geometrical

More information

Introduction to Light Microscopy. (Image: T. Wittman, Scripps)

Introduction to Light Microscopy. (Image: T. Wittman, Scripps) Introduction to Light Microscopy (Image: T. Wittman, Scripps) The Light Microscope Four centuries of history Vibrant current development One of the most widely used research tools A. Khodjakov et al. Major

More information

Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality

Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality Andrei Fridman Gudrun Høye Trond Løke Optical Engineering

More information

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Camera & Color Overview Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Book: Hartley 6.1, Szeliski 2.1.5, 2.2, 2.3 The trip

More information

Optical Perspective of Polycarbonate Material

Optical Perspective of Polycarbonate Material Optical Perspective of Polycarbonate Material JP Wei, Ph. D. November 2011 Introduction Among the materials developed for eyeglasses, polycarbonate is one that has a number of very unique properties and

More information

COURSE NAME: PHOTOGRAPHY AND AUDIO VISUAL PRODUCTION (VOCATIONAL) FOR UNDER GRADUATE (FIRST YEAR)

COURSE NAME: PHOTOGRAPHY AND AUDIO VISUAL PRODUCTION (VOCATIONAL) FOR UNDER GRADUATE (FIRST YEAR) COURSE NAME: PHOTOGRAPHY AND AUDIO VISUAL PRODUCTION (VOCATIONAL) FOR UNDER GRADUATE (FIRST YEAR) PAPER TITLE: BASIC PHOTOGRAPHIC UNIT - 3 : SIMPLE LENS TOPIC: LENS PROPERTIES AND DEFECTS OBJECTIVES By

More information

A simulation tool for evaluating digital camera image quality

A simulation tool for evaluating digital camera image quality A simulation tool for evaluating digital camera image quality Joyce Farrell ab, Feng Xiao b, Peter Catrysse b, Brian Wandell b a ImagEval Consulting LLC, P.O. Box 1648, Palo Alto, CA 94302-1648 b Stanford

More information

WaveMaster IOL. Fast and accurate intraocular lens tester

WaveMaster IOL. Fast and accurate intraocular lens tester WaveMaster IOL Fast and accurate intraocular lens tester INTRAOCULAR LENS TESTER WaveMaster IOL Fast and accurate intraocular lens tester WaveMaster IOL is a new instrument providing real time analysis

More information

FRESNEL LENS TOPOGRAPHY WITH 3D METROLOGY

FRESNEL LENS TOPOGRAPHY WITH 3D METROLOGY FRESNEL LENS TOPOGRAPHY WITH 3D METROLOGY INTRO: Prepared by Benjamin Mell 6 Morgan, Ste156, Irvine CA 92618 P: 949.461.9292 F: 949.461.9232 nanovea.com Today's standard for tomorrow's materials. 2010

More information

Immersion Lithography Micro-Objectives

Immersion Lithography Micro-Objectives Immersion Lithography Micro-Objectives James Webb and Louis Denes Corning Tropel Corporation, 60 O Connor Rd, Fairport, NY 14450 (U.S.A.) 585-388-3500, webbj@corning.com, denesl@corning.com ABSTRACT The

More information

OPTICAL IMAGING AND ABERRATIONS

OPTICAL IMAGING AND ABERRATIONS OPTICAL IMAGING AND ABERRATIONS PARTI RAY GEOMETRICAL OPTICS VIRENDRA N. MAHAJAN THE AEROSPACE CORPORATION AND THE UNIVERSITY OF SOUTHERN CALIFORNIA SPIE O P T I C A L E N G I N E E R I N G P R E S S A

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

Lecture 2: Geometrical Optics. Geometrical Approximation. Lenses. Mirrors. Optical Systems. Images and Pupils. Aberrations.

Lecture 2: Geometrical Optics. Geometrical Approximation. Lenses. Mirrors. Optical Systems. Images and Pupils. Aberrations. Lecture 2: Geometrical Optics Outline 1 Geometrical Approximation 2 Lenses 3 Mirrors 4 Optical Systems 5 Images and Pupils 6 Aberrations Christoph U. Keller, Leiden Observatory, keller@strw.leidenuniv.nl

More information

Lecture 2: Geometrical Optics. Geometrical Approximation. Lenses. Mirrors. Optical Systems. Images and Pupils. Aberrations.

Lecture 2: Geometrical Optics. Geometrical Approximation. Lenses. Mirrors. Optical Systems. Images and Pupils. Aberrations. Lecture 2: Geometrical Optics Outline 1 Geometrical Approximation 2 Lenses 3 Mirrors 4 Optical Systems 5 Images and Pupils 6 Aberrations Christoph U. Keller, Leiden Observatory, keller@strw.leidenuniv.nl

More information

Compact Dual Field-of-View Telescope for Small Satellite Payloads

Compact Dual Field-of-View Telescope for Small Satellite Payloads Compact Dual Field-of-View Telescope for Small Satellite Payloads James C. Peterson Space Dynamics Laboratory 1695 North Research Park Way, North Logan, UT 84341; 435-797-4624 Jim.Peterson@sdl.usu.edu

More information

Image Formation and Camera Design

Image Formation and Camera Design Image Formation and Camera Design Spring 2003 CMSC 426 Jan Neumann 2/20/03 Light is all around us! From London & Upton, Photography Conventional camera design... Ken Kay, 1969 in Light & Film, TimeLife

More information

Chapters 1 & 2. Definitions and applications Conceptual basis of photogrammetric processing

Chapters 1 & 2. Definitions and applications Conceptual basis of photogrammetric processing Chapters 1 & 2 Chapter 1: Photogrammetry Definitions and applications Conceptual basis of photogrammetric processing Transition from two-dimensional imagery to three-dimensional information Automation

More information

INFRARED IMAGING-PASSIVE THERMAL COMPENSATION VIA A SIMPLE PHASE MASK

INFRARED IMAGING-PASSIVE THERMAL COMPENSATION VIA A SIMPLE PHASE MASK Romanian Reports in Physics, Vol. 65, No. 3, P. 700 710, 2013 Dedicated to Professor Valentin I. Vlad s 70 th Anniversary INFRARED IMAGING-PASSIVE THERMAL COMPENSATION VIA A SIMPLE PHASE MASK SHAY ELMALEM

More information

ABOUT RESOLUTION. pco.knowledge base

ABOUT RESOLUTION. pco.knowledge base The resolution of an image sensor describes the total number of pixel which can be used to detect an image. From the standpoint of the image sensor it is sufficient to count the number and describe it

More information

Optical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation

Optical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation Optical Performance of Nikon F-Mount Lenses Landon Carter May 11, 2016 2.671 Measurement and Instrumentation Abstract In photographic systems, lenses are one of the most important pieces of the system

More information

Explanation of Aberration and Wavefront

Explanation of Aberration and Wavefront Explanation of Aberration and Wavefront 1. What Causes Blur? 2. What is? 4. What is wavefront? 5. Hartmann-Shack Aberrometer 6. Adoption of wavefront technology David Oh 1. What Causes Blur? 2. What is?

More information

LENSES. a. To study the nature of image formed by spherical lenses. b. To study the defects of spherical lenses.

LENSES. a. To study the nature of image formed by spherical lenses. b. To study the defects of spherical lenses. Purpose Theory LENSES a. To study the nature of image formed by spherical lenses. b. To study the defects of spherical lenses. formation by thin spherical lenses s are formed by lenses because of the refraction

More information

Use of Computer Generated Holograms for Testing Aspheric Optics

Use of Computer Generated Holograms for Testing Aspheric Optics Use of Computer Generated Holograms for Testing Aspheric Optics James H. Burge and James C. Wyant Optical Sciences Center, University of Arizona, Tucson, AZ 85721 http://www.optics.arizona.edu/jcwyant,

More information

Photographic zoom fisheye lens design for DSLR cameras

Photographic zoom fisheye lens design for DSLR cameras Photographic zoom fisheye lens design for DSLR cameras Yufeng Yan Jose Sasian Yufeng Yan, Jose Sasian, Photographic zoom fisheye lens design for DSLR cameras, Opt. Eng. 56(9), 095103 (2017), doi: 10.1117/1.OE.56.9.095103.

More information

LENSES. INEL 6088 Computer Vision

LENSES. INEL 6088 Computer Vision LENSES INEL 6088 Computer Vision Digital camera A digital camera replaces film with a sensor array Each cell in the array is a Charge Coupled Device light-sensitive diode that converts photons to electrons

More information

Is Aberration-Free Correction the Best Goal

Is Aberration-Free Correction the Best Goal Is Aberration-Free Correction the Best Goal Stephen Burns, PhD, Jamie McLellan, Ph.D., Susana Marcos, Ph.D. The Schepens Eye Research Institute. Schepens Eye Research Institute, an affiliate of Harvard

More information

Design of Temporally Dithered Codes for Increased Depth of Field in Structured Light Systems

Design of Temporally Dithered Codes for Increased Depth of Field in Structured Light Systems Design of Temporally Dithered Codes for Increased Depth of Field in Structured Light Systems Ricardo R. Garcia University of California, Berkeley Berkeley, CA rrgarcia@eecs.berkeley.edu Abstract In recent

More information

Optics: An Introduction

Optics: An Introduction It is easy to overlook the contribution that optics make to a system; beyond basic lens parameters such as focal distance, the details can seem confusing. This Tech Tip presents a basic guide to optics

More information

PHYSICS OPTICS. Mr Rishi Gopie

PHYSICS OPTICS. Mr Rishi Gopie OPTICS Mr Rishi Gopie Ray Optics II Images formed by lens maybe real or virtual and may have different characteristics and locations that depend on: i) The type of lens involved, whether converging or

More information

On spatial resolution

On spatial resolution On spatial resolution Introduction How is spatial resolution defined? There are two main approaches in defining local spatial resolution. One method follows distinction criteria of pointlike objects (i.e.

More information

Measurement of Texture Loss for JPEG 2000 Compression Peter D. Burns and Don Williams* Burns Digital Imaging and *Image Science Associates

Measurement of Texture Loss for JPEG 2000 Compression Peter D. Burns and Don Williams* Burns Digital Imaging and *Image Science Associates Copyright SPIE Measurement of Texture Loss for JPEG Compression Peter D. Burns and Don Williams* Burns Digital Imaging and *Image Science Associates ABSTRACT The capture and retention of image detail are

More information

Ch 24. Geometric Optics

Ch 24. Geometric Optics text concept Ch 24. Geometric Optics Fig. 24 3 A point source of light P and its image P, in a plane mirror. Angle of incidence =angle of reflection. text. Fig. 24 4 The blue dashed line through object

More information

Performance of Image Intensifiers in Radiographic Systems

Performance of Image Intensifiers in Radiographic Systems DOE/NV/11718--396 LA-UR-00-211 Performance of Image Intensifiers in Radiographic Systems Stuart A. Baker* a, Nicholas S. P. King b, Wilfred Lewis a, Stephen S. Lutz c, Dane V. Morgan a, Tim Schaefer a,

More information

Exposure schedule for multiplexing holograms in photopolymer films

Exposure schedule for multiplexing holograms in photopolymer films Exposure schedule for multiplexing holograms in photopolymer films Allen Pu, MEMBER SPIE Kevin Curtis,* MEMBER SPIE Demetri Psaltis, MEMBER SPIE California Institute of Technology 136-93 Caltech Pasadena,

More information

Fig Color spectrum seen by passing white light through a prism.

Fig Color spectrum seen by passing white light through a prism. 1. Explain about color fundamentals. Color of an object is determined by the nature of the light reflected from it. When a beam of sunlight passes through a glass prism, the emerging beam of light is not

More information

PROCEEDINGS OF SPIE. Measurement of low-order aberrations with an autostigmatic microscope

PROCEEDINGS OF SPIE. Measurement of low-order aberrations with an autostigmatic microscope PROCEEDINGS OF SPIE SPIEDigitalLibrary.org/conference-proceedings-of-spie Measurement of low-order aberrations with an autostigmatic microscope William P. Kuhn Measurement of low-order aberrations with

More information

Defense Technical Information Center Compilation Part Notice

Defense Technical Information Center Compilation Part Notice UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADPO 11345 TITLE: Measurement of the Spatial Frequency Response [SFR] of Digital Still-Picture Cameras Using a Modified Slanted

More information

Lens Design I. Lecture 10: Optimization II Herbert Gross. Summer term

Lens Design I. Lecture 10: Optimization II Herbert Gross. Summer term Lens Design I Lecture : Optimization II 8-6- Herbert Gross Summer term 8 www.iap.uni-jena.de Preliminary Schedule - Lens Design I 8.4. Basics 9.4. Properties of optical systems I 3 6.4. Properties of optical

More information

Spectral Analysis of the LUND/DMI Earthshine Telescope and Filters

Spectral Analysis of the LUND/DMI Earthshine Telescope and Filters Spectral Analysis of the LUND/DMI Earthshine Telescope and Filters 12 August 2011-08-12 Ahmad Darudi & Rodrigo Badínez A1 1. Spectral Analysis of the telescope and Filters This section reports the characterization

More information

Lecture PowerPoint. Chapter 25 Physics: Principles with Applications, 6 th edition Giancoli

Lecture PowerPoint. Chapter 25 Physics: Principles with Applications, 6 th edition Giancoli Lecture PowerPoint Chapter 25 Physics: Principles with Applications, 6 th edition Giancoli 2005 Pearson Prentice Hall This work is protected by United States copyright laws and is provided solely for the

More information

Notes from Lens Lecture with Graham Reed

Notes from Lens Lecture with Graham Reed Notes from Lens Lecture with Graham Reed Light is refracted when in travels between different substances, air to glass for example. Light of different wave lengths are refracted by different amounts. Wave

More information

Optical and mechanical parameters. 100 mm N. of elements 20.5 mm Dimensions 11.7 degrees Weight F/N = 4 (fixed) N.A.

Optical and mechanical parameters. 100 mm N. of elements 20.5 mm Dimensions 11.7 degrees Weight F/N = 4 (fixed) N.A. OB SWIR 100 LENS OB-SWIR100/4 P/N C0416 General Description This family of high resolution SWIR lenses image from 0.9 2.3 µmm making them especially well-suited for PCB inspection, special laser applications,

More information

phone extn.3662, fax: , nitt.edu ABSTRACT

phone extn.3662, fax: , nitt.edu ABSTRACT Analysis of Refractive errors in the human eye using Shack Hartmann Aberrometry M. Jesson, P. Arulmozhivarman, and A.R. Ganesan* Department of Physics, National Institute of Technology, Tiruchirappalli

More information