Imaging Particle Analysis: The Importance of Image Quality
Lew Brown, Technical Director, Fluid Imaging Technologies, Inc.

Abstract: Imaging particle analysis systems can derive much more information about particles than classical volumetric systems (light obscuration, electrozone sensing, laser diffraction, etc.), which can only report particle size as the Equivalent Spherical Diameter (ESD): the diameter of a sphere corresponding to the derived volume of the particle. Imaging-based systems can also measure many different parameters for each particle, representing both shape and gray-scale information. This gives them the ability to differentiate among dissimilar particles in a heterogeneous mixture: image-based systems can classify particles into different types, where volumetric systems can only report a size for each particle. One drawback of imaging particle analysis systems is that they are limited at the lower end of the size range they can properly characterize (1). Within their resolution range, however, they offer significant advantages. This paper discusses how the quality of the images directly affects the ability of the system to make proper measurements, and therefore its ability to differentiate between particle types. Using real-world data and examples, we come to the succinct (but rather unscientific) conclusion:

Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications

I. What is Image Quality?

Everyone has a basic, purely subjective (qualitative) understanding of the term "image quality." We make qualitative judgments about image quality all the time, especially with our own pictures taken using consumer cameras. Figure 1 shows a simple landscape image in two versions, one of which is clearly of higher quality than the other.
The lower image is perceived as poorer quality due to its lack of sharpness and improper color balance.

Figure 1: Landscape image with good image quality (top) and bad image quality (bottom).

It is crucially important in any discussion of image quality to recognize that many of the contributing factors can only be evaluated at the systems level. As an example, the quality of an image produced by a digital camera is the sum of several parts: the lens's ability to correctly transmit the information, the sensor's resolution, and the noise and processing characteristics of the camera electronics. Furthermore, if the image produced by this camera is being judged for quality by a human being, then the characteristics of the observer also come into play: the quality of the person's eyesight (do they need corrective lenses, and if so, are they wearing them?), the distance at which the image is viewed, the lighting conditions, etc. Last, but not least, in the case of a human observer, the situation becomes even muddier because each person's brain processes visual information differently!

Some of the common descriptors used when discussing image quality include sharpness (amount of detail conveyed), dynamic range (range of light levels captured), contrast, distortion (usually caused by the optics), tonal mapping, and artifacts (usually produced by the electronics or software processing of the image). Of all of these, the factor that has the largest influence on perceived image quality is usually sharpness. Although this is sometimes simplified to "focus," sharpness is really a combination of many factors (focus being only one) that together indicate the overall amount of spatial information captured by the image. Because subjective image quality is so variable, scientists have found it necessary to come up with quantitative measurements of image quality. Usually these measurements are made at the systems level, as discussed above, except in cases where a particular element of an optical system can be isolated, such as the lens alone.

II. Quantitative Measures of Image Quality

As stated above, the most critical factor in image quality is sharpness, or the ability of the system to capture image detail. If an image is blurry or out of focus to begin with, other factors such as tonal mapping, color balance, and dynamic range do not carry as much meaning, because of the lack of information contained in the image in the first place. Sharpness is most commonly equated with the term "resolution," which can be a misnomer and a cause for confusion, since there are many different kinds of resolution (sensor, spectral, display, etc.). In general, however, the resolution of a system is defined as its ability to reproduce spatial detail in the object being imaged (2). In systems that include the human eye, resolution is a combination of sharpness and acutance. Acutance is perceived sharpness caused by the changes in brightness at edge transitions (3). Those familiar with Photoshop frequently use operations such as unsharp masking and sharpening; these operations increase perceived sharpness through acutance changes only, while the actual resolution of the image is unchanged. As an example, the lower image in Figure 1 was produced merely by applying a blurring filter and lowering the image acutance, but it still contains the same resolution (in pixels) as the original image!
Quantitative measurement of image sharpness is typically expressed in terms of the ability of the system to distinguish closely spaced, high-contrast pairs of lines over a given distance, most typically as line pairs per millimeter, or lp/mm (4). To measure this, the system being evaluated is used to image a standardized test target consisting of groups of bars at increasing numbers of line pairs per millimeter; the smallest bars that the imaging system can still discern mark the limit of the system's resolving power. One of the most commonly used targets is the 1951 USAF Resolution Test Target (Figure 2).

Figure 2: USAF 1951 Resolution Test Target (5).

The problem with merely using lp/mm to measure sharpness is that a subjective judgment is still left in the word "discern" in the definition above. To overcome this limitation, a measurement called the Modulation Transfer Function (MTF) was introduced, which measures the ability of the system to transfer contrast in the frequency domain (5). The modulation (or variance) of contrast transmitted by the system from target to image, ranging from 100% (unattenuated) to 0% (completely attenuated), is measured over a varying range of spatial frequencies and plotted (5). A typical MTF graph is shown in Figure 3. The modulation of contrast is measured by using the system to image a target such as the USAF target above, or more commonly a sinusoidal (in intensity) target of varying frequency, and measuring the transmitted contrast versus the original contrast on the target. The commonly accepted modulation representing the limit of a system's ability to transfer information is 9%, which is also known as the Rayleigh diffraction limit (5).

Figure 3: Typical MTF plot (6).

Since imaging particle analysis involves measurements made on images by a computer, we need not worry about acutance in this discussion, as it is a perceived-only (via the human eye) characteristic. However, it is important to remember that when any of these images are viewed by the human eye, the subjectivity of vision and acutance can cause confusion!
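The contrast-modulation measurement behind the MTF can be sketched numerically. The following is a minimal illustration, not FlowCAM code: it computes the Michelson contrast of a synthetic sinusoidal intensity profile before and after an assumed contrast attenuation, and takes their ratio as the MTF value at that spatial frequency. The specific intensity values are illustrative assumptions.

```python
import numpy as np

def modulation(profile):
    """Michelson contrast of an intensity profile: (Imax - Imin) / (Imax + Imin)."""
    i_max, i_min = profile.max(), profile.min()
    return (i_max - i_min) / (i_max + i_min)

def mtf_at_frequency(target, image):
    """MTF = transmitted modulation divided by the original target modulation."""
    return modulation(image) / modulation(target)

# Sinusoidal target (full contrast) and an attenuated copy as the system "sees" it
x = np.linspace(0, 2 * np.pi * 10, 1000)   # 10 cycles of the sinusoidal target
target = 0.5 + 0.5 * np.sin(x)             # modulation = 1.0 (unattenuated)
image = 0.5 + 0.2 * np.sin(x)              # system transmits reduced contrast

print(f"MTF at this frequency: {mtf_at_frequency(target, image):.2f}")  # → 0.40
```

Repeating this ratio at increasing spatial frequencies, and plotting the result, produces a curve like the one in Figure 3; the frequency where the ratio falls to 0.09 corresponds to the Rayleigh limit mentioned above.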
A final common, related descriptor of image sharpness is the Point Spread Function (PSF), which can be thought of as the spatial-domain equivalent of the MTF (7). The PSF is the intensity response of the system when it images a point light source at infinity (i.e., perfectly focused). If the system were "perfect," the image would also be a point. However, diffraction and other factors cause the image of the point to be "spread" out, and in fact diffraction causes intensity "rings" to emanate from the center. This representation is also referred to as the "Airy disk." A 3-D plot and a 2-D intensity image of the Airy disk are shown in Figure 4 (8).

Figure 4: Airy disk representation of the Point Spread Function (PSF), showing 3-D and 2-D intensity plots (8).

III. Variation in Image Measurements Caused by Varying Image Quality

In imaging particle analysis, the measurements made on the particles are done on a thresholded binary image rather than on the original gray-scale image (9). This is done by choosing a gray-scale threshold value and then declaring each pixel "particle" or "not particle" by comparing the pixel's gray-scale value to the threshold. Figure 5 shows two particle images of a 10µm calibration bead, one in focus and the other out of focus. It is important to note that these images carry "artificial blur" in this diagram, caused by having to upsample the original 72ppi screen image to a print resolution of 300ppi.

Figure 5: Particle images of 10µm calibrated beads, "sharp" focus (left) and "blurry" focus (right). (NOTE: both images have extra blur caused by the pixel-replicated zoom necessary for print display.)

An intensity profile can be generated for each by drawing a line across the particle (passing through its center) and plotting distance versus intensity. Figure 6 shows the result of doing this for the two particles in Figure 5.
As can be seen in Table 1, based on the profiles in Figure 6, if both images are thresholded identically at an intensity value of 150, the sharp particle has a diameter of 10.13µm, whereas the blurry particle has a diameter of 10.96µm. Table 1 shows how the diameter of each particle varies with the threshold value. Note that the ESD of the blurry image varies by a total of 12.86µm between threshold values of 100 and 200, whereas the ESD of the sharp image varies by only 1.67µm over the same range.

Figure 6: Intensity profile plots for the two images in Figure 5. The upper plot is for the "sharp" image, the lower plot for the "blurred" image.

Table 1: ESD values for varying threshold levels on the two particles shown in Figure 5 (columns: threshold value; ESD in µm for the "sharp" image; ESD in µm for the "blurry" image).
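The threshold sensitivity just described can be reproduced with a simple numerical sketch. The logistic edge model, intensity values, and edge widths below are assumptions chosen to mimic Figure 6 qualitatively, not the actual FlowCAM data: a sharp edge and a blurry edge crossing the same range of thresholds yield very different diameter spreads.

```python
import numpy as np

def bead_profile(x, diameter, edge_width, background=220.0, particle=60.0):
    """Synthetic intensity profile across a dark bead on a bright background.
    edge_width models focus: small -> sharp edge, large -> blurry edge.
    (All intensity values are illustrative assumptions, not FlowCAM data.)"""
    d = np.abs(x) - diameter / 2.0          # distance outside the bead boundary
    return particle + (background - particle) / (1.0 + np.exp(-d / edge_width))

def measured_diameter(x, profile, threshold):
    """Width of the region darker than the threshold (the binarized particle)."""
    inside = profile < threshold
    return np.ptp(x[inside]) if inside.any() else 0.0

x = np.linspace(-15.0, 15.0, 30001)             # microns across the bead center
sharp = bead_profile(x, 10.0, edge_width=0.2)   # in-focus: steep edges
blurry = bead_profile(x, 10.0, edge_width=2.0)  # out-of-focus: shallow edges

for name, prof in [("sharp", sharp), ("blurry", blurry)]:
    spread = abs(measured_diameter(x, prof, 200) - measured_diameter(x, prof, 100))
    print(f"{name}: diameter spread over thresholds 100-200 = {spread:.2f} um")
# sharp spread ≈ 1.2 um; blurry spread ≈ 12 um — echoing the behavior in Table 1
```

The shallow slope of the blurred edge means that moving the threshold sweeps the binarization boundary across many microns, while the steep edge barely moves; this is exactly why the blurry particle's ESD in Table 1 swings an order of magnitude more than the sharp one's.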
IV. Example of the Effect of Image Quality on Particle Measurements

To concretely demonstrate how varying image quality affects measurements in imaging particle analysis, a simple experiment was run. Fluid Imaging Technologies' FlowCAM was used to perform the experiment, in which NIST-traceable calibrated size beads were imaged by the system. The system was set up with the 10X objective lens (which yields an overall system magnification of 100X) and 10µm-diameter beads. For these parameters, the FlowCAM would normally use a 100µm-depth flow cell in order to keep the particles in sharp focus. In this instance, however, a 300µm-depth flow cell was used to ensure that some of the imaged particles would be out of focus. One of the many measurements made by the FlowCAM's VisualSpreadsheet software is "edge gradient." Edge gradient essentially measures the sharpness of edges in the image by looking at the slope of the gray-scale change at an edge. High edge gradients indicate sharp edges, while low edge gradients indicate fuzzy edges. If the target is a high-contrast object, such as the beads in this experiment, then the particles with high edge gradient are seen as being in sharp focus, whereas the ones with low edge gradient are seen as blurry. So the edge gradient measurement can be used to quickly separate high-quality (sharp) bead images from low-quality (blurry) bead images. After running the beads through the FlowCAM several times in this configuration, the diameter-versus-frequency graphs clearly showed a bimodal size distribution: one peak at 10µm representing the beads in sharp focus, and another (wider) peak centered around 16µm representing the out-of-focus beads. Using the VisualSpreadsheet software, one can quickly isolate the best-quality images from each peak by merely selecting the peak and then sorting on edge gradient.
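A generic version of such an edge gradient measurement can be sketched as follows. The FlowCAM's actual VisualSpreadsheet implementation is not public, so the boundary-extraction method and the synthetic bead images here are illustrative assumptions only: the score is the mean gray-scale gradient magnitude along the thresholded particle's boundary.

```python
import numpy as np

def synthetic_bead(size=101, radius=20.0, edge_width=1.0):
    """Dark disk on a bright background with a logistic edge (illustrative)."""
    yy, xx = np.mgrid[:size, :size]
    r = np.hypot(xx - size // 2, yy - size // 2)
    return 60.0 + 160.0 / (1.0 + np.exp(-(r - radius) / edge_width))

def edge_gradient(image, threshold=140.0):
    """Mean gray-scale gradient magnitude along the particle boundary.
    High values -> steep (sharp) edges; low values -> shallow (blurry) edges."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    binary = image < threshold                     # dark particle pixels
    interior = binary.copy()                       # erode by one pixel to find the rim
    interior[1:-1, 1:-1] = (binary[1:-1, 1:-1] & binary[:-2, 1:-1] &
                            binary[2:, 1:-1] & binary[1:-1, :-2] & binary[1:-1, 2:])
    edge = binary & ~interior                      # boundary pixels only
    return magnitude[edge].mean() if edge.any() else 0.0

sharp_score = edge_gradient(synthetic_bead(edge_width=0.5))
blurry_score = edge_gradient(synthetic_bead(edge_width=4.0))
print(sharp_score > blurry_score)  # → True: sharper edges score higher
```

Sorting a set of particle images by a score like this is enough to separate the in-focus population from the out-of-focus one, which is how the peaks in the bimodal distribution above were isolated.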
Figure 5 shows the results of doing this, and clearly demonstrates the measurement change that occurs when the beads are not in sharp focus. The mean Equivalent Spherical Diameter (ESD) for the in-focus beads was 9.81µm, whereas the mean ESD for the out-of-focus beads was 15.64µm.

Figure 5: Comparison of bead images and corresponding summary statistics from the lower peak (left) and the upper peak (right).
Figure 6: Close-up view of two bead image sets (left), and with binary overlay (right). Note the higher degree of variability, in both size and shape, of the out-of-focus beads (right-hand side of each screen shot).

One might argue that this result is somewhat "rigged," since the FlowCAM's distance calibration was based upon sharply focused beads. However, one need only look at the standard deviation and coefficient of variation (CV) of the two data sets to see that the mean is only part of the story. The out-of-focus particles exhibit a significantly higher standard deviation (0.56µm versus 0.14µm) and CV (3.57% versus 1.46%) than do the in-focus particles. To put this in rather succinct (but unscientific) terms:

Fuzzy Images = Fuzzy Measurements

V. How Particle Measurements Affect Particle Classification

As shown in the example above, image quality (here, specifically sharpness) has a direct relationship to the quality and precision of the measurements made on particles from the images. The efficacy of any type of particle classification, whether value-based or statistical (9), depends on the quality of the measurements, and therefore directly on the original image quality produced by the system. In either type of classification, the measurements are the basis for characterizing the particle images. So, if we take the above example to its next logical step, we can expect image quality to directly affect the ability of pattern recognition software to properly classify particles. To demonstrate this, another experiment was run using the FlowCAM. This time, rather than using beads, we wanted a real-world sample containing non-spherical particles that vary slightly in shape, size, transparency, and other attributes. A culture of the phytoplankton species Cosmarium was used.

Figure 7: Typical in-focus images of Cosmarium algae as captured by the FlowCAM.
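As a quick arithmetic check on the statistics quoted above, the CV is simply the standard deviation divided by the mean; recomputing it from the rounded summary values reproduces the reported figures to within rounding.

```python
# Coefficient of variation (CV = std / mean) for the two bead populations,
# using the rounded summary statistics reported in the text
in_focus = {"mean_esd_um": 9.81, "std_um": 0.14}
out_of_focus = {"mean_esd_um": 15.64, "std_um": 0.56}

for name, stats in [("in-focus", in_focus), ("out-of-focus", out_of_focus)]:
    cv = 100.0 * stats["std_um"] / stats["mean_esd_um"]
    print(f"{name}: CV = {cv:.2f}%")
# → in-focus ≈ 1.43%, out-of-focus ≈ 3.58% (consistent with the reported
#   1.46% and 3.57%, which were computed from the unrounded data)
```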
This species has a fairly distinctive shape, as can be seen from the sample images in Figure 7. The average size of these particles is around 50µm, so they are normally run in the FlowCAM using the 4X objective (approximately 40X overall magnification) with a 300µm-depth flow cell. As before, in order to capture particles both in and out of focus, a deeper flow cell than normal (600µm in depth) was used. After the sample had been run, two image libraries were built by selecting 8 Cosmarium images that represented typical in-focus images (Figure 7) and 8 that represented typical blurry images (Figure 8).

Figure 8: Typical out-of-focus images of Cosmarium algae as captured by the FlowCAM.

These library images are then used as the basis for a statistical pattern recognition algorithm (9), whereby each particle in the run is compared statistically against the library particles to determine how closely it matches the target particles. Figure 9 shows the overall results for one of the FlowCAM runs made with the Cosmarium culture. The left-side window shows the summary statistics for the run: 4,500 particle images were collected and stored, yielding a mean volume-weighted ESD of 52.91µm (the mean aspect ratio, width/length, is also reported). The right-side window shows a random sampling of the particle images themselves; the variations in shape and size, caused both by the organic variation of the species itself and by the variation in focus through the flow cell depth, are clearly visible.

Figure 9: Summary results of a FlowCAM run using the Cosmarium culture.

Figure 10: Typical images and summary statistics for particles found with the "sharp" image library.

Figure 11: Typical images and summary statistics for particles found with the "blurry" image library.
At this point, the two statistical filters created from the libraries shown in Figures 7 and 8 were run against the entire run of data (separately, not at the same time). Figure 10 shows some of the particle images found using the "sharp" library, along with the summary statistics for the filter; Figure 11 shows the same results using the "blurry" library. Out of the 4,500 original particles, the sharp filter found 230 like particles, representing a concentration of 1,029 particles/ml, whereas the blurry filter found only 44 like particles, representing a concentration of only 197 particles/ml. So the sharp filter found roughly five times as many particles as the blurry filter. This is because the sharp library has tighter measurements, forming a tighter cluster in the n-dimensional pattern recognition space (9), thereby more specifically defining the shape and gray-scale attributes of the desired particles. The blurry library forms a much looser, more ambiguous cluster, due to the larger variance in all of its measurements. This particular case is a very simple situation in which only a single particle type is present in the sample. In the real world, such as when studying a water sample with multiple algal species present, or a protein sample containing agglomerated proteins, silicone droplets, and other foreign matter, the importance of sharp images in the libraries becomes even greater. In these applications, the filters will be run at the same time and must be mutually exclusive (a particle should not belong to two different types). Blurry library images will not only cause undercounting, as in the example just shown; they will also lead to false positives and false negatives in each class, making the classification less accurate and less repeatable. To further expound on the succinct (but unscientific) conclusion of Section IV:

Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications
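The effect of cluster tightness can be illustrated with a toy per-feature statistical filter. Everything below, including the feature choices, library values, and the k-sigma acceptance rule, is a hypothetical stand-in for the proprietary statistical pattern recognition used by VisualSpreadsheet (9): a tight library defines a narrow acceptance region, while a high-variance library defines an ambiguous one.

```python
import numpy as np

def build_filter(library, k=3.0):
    """Per-feature mean/std from a small image library; a particle matches if
    every feature lies within k standard deviations of the library mean.
    (A generic sketch, not the actual FlowCAM algorithm.)"""
    mean = library.mean(axis=0)
    std = library.std(axis=0, ddof=1)
    def matches(particle):
        return bool(np.all(np.abs(particle - mean) <= k * std))
    return matches

# Hypothetical feature vectors: [ESD (um), aspect ratio, edge gradient]
sharp_library = np.array([
    [50.1, 0.95, 80.0], [49.8, 0.93, 78.5], [50.4, 0.96, 81.2],
    [49.9, 0.94, 79.8], [50.2, 0.95, 80.5],
])  # tight cluster: in-focus images measure consistently
blurry_library = np.array([
    [55.0, 0.80, 20.0], [62.3, 0.70, 15.2], [58.9, 0.90, 25.7],
    [70.1, 0.65, 12.4], [52.2, 0.85, 30.3],
])  # loose cluster: blur inflates the variance of every measurement

is_sharp = build_filter(sharp_library)
candidate = np.array([50.0, 0.94, 79.0])   # an in-focus Cosmarium-like particle
print(is_sharp(candidate))  # → True
```

The tight library accepts this candidate while rejecting dissimilar particles; the blurry library's acceptance region is so wide, and its center so displaced, that it both misses valid targets and admits look-alikes, which is the undercounting and false-positive/false-negative behavior described above.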
VI. Conclusions

While image quality can sometimes be a rather elusive concept in the qualitative (subjective) realm, digital image processing suffers no ambiguity in this arena: digital images are purely mathematical constructs, and can therefore be evaluated in mathematical, quantitative terms. Since imaging particle analysis often deals with microscopic particles, it frequently runs into limitations based purely on resolution (1). But even when the particles are within the range that a particular imaging particle analysis system can resolve well, image quality still greatly affects the system's ability to classify particles of different types (let alone properly measure them!). Through two real-world examples, this paper has shown that image quality, in particular sharpness, greatly affects measurement quality and therefore the system's ability to properly classify particles of different types. Since the true strength of imaging particle analysis systems over volumetric-based systems (which can only measure particle size) is their ability to differentiate particles based upon shape and gray scale, image quality is paramount to the value of such a system. As previously summarized:

Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications

VII. References

1.) "Imaging Particle Analysis: Resolution and Sampling Considerations", Lew Brown, Fluid Imaging Technologies web site.
2.) Wikipedia entry on Optical Resolution.
3.) Wikipedia entry on Acutance.
4.) "Understanding Sharpness", Michael Reichmann, Luminous Landscape web site.
5.) "Understanding image sharpness part 1: Introduction to resolution and MTF curves", Norman Koren, web site tutorial.
6.) MTF plot from: technical-support/optics/modulation-transfer-function/
7.) Wikipedia entry on Point Spread Function: en.wikipedia.org/wiki/point_spread_function
8.) Wikipedia entry on Airy Disk: wiki/airy_disk
9.) "Particle Image Understanding - A Primer", Lew Brown, Fluid Imaging Technologies web site: com/imaging-particle-analysis-white-papers.aspx
More information1.Discuss the frequency domain techniques of image enhancement in detail.
1.Discuss the frequency domain techniques of image enhancement in detail. Enhancement In Frequency Domain: The frequency domain methods of image enhancement are based on convolution theorem. This is represented
More informationPROCEEDINGS OF SPIE. Measurement of low-order aberrations with an autostigmatic microscope
PROCEEDINGS OF SPIE SPIEDigitalLibrary.org/conference-proceedings-of-spie Measurement of low-order aberrations with an autostigmatic microscope William P. Kuhn Measurement of low-order aberrations with
More informationImage Filtering. Median Filtering
Image Filtering Image filtering is used to: Remove noise Sharpen contrast Highlight contours Detect edges Other uses? Image filters can be classified as linear or nonlinear. Linear filters are also know
More informationE X P E R I M E N T 12
E X P E R I M E N T 12 Mirrors and Lenses Produced by the Physics Staff at Collin College Copyright Collin College Physics Department. All Rights Reserved. University Physics II, Exp 12: Mirrors and Lenses
More informationDigital Image Processing
Digital Image Processing Part 2: Image Enhancement Digital Image Processing Course Introduction in the Spatial Domain Lecture AASS Learning Systems Lab, Teknik Room T26 achim.lilienthal@tech.oru.se Course
More informationCS534 Introduction to Computer Vision. Linear Filters. Ahmed Elgammal Dept. of Computer Science Rutgers University
CS534 Introduction to Computer Vision Linear Filters Ahmed Elgammal Dept. of Computer Science Rutgers University Outlines What are Filters Linear Filters Convolution operation Properties of Linear Filters
More informationUnderstanding Infrared Camera Thermal Image Quality
Access to the world s leading infrared imaging technology Noise { Clean Signal www.sofradir-ec.com Understanding Infared Camera Infrared Inspection White Paper Abstract You ve no doubt purchased a digital
More informationRefined Slanted-Edge Measurement for Practical Camera and Scanner Testing
Refined Slanted-Edge Measurement for Practical Camera and Scanner Testing Peter D. Burns and Don Williams Eastman Kodak Company Rochester, NY USA Abstract It has been almost five years since the ISO adopted
More informationThe Statistics of Visual Representation Daniel J. Jobson *, Zia-ur Rahman, Glenn A. Woodell * * NASA Langley Research Center, Hampton, Virginia 23681
The Statistics of Visual Representation Daniel J. Jobson *, Zia-ur Rahman, Glenn A. Woodell * * NASA Langley Research Center, Hampton, Virginia 23681 College of William & Mary, Williamsburg, Virginia 23187
More informationApplications of Optics
Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics
More informationApplying Automated Optical Inspection Ben Dawson, DALSA Coreco Inc., ipd Group (987)
Applying Automated Optical Inspection Ben Dawson, DALSA Coreco Inc., ipd Group bdawson@goipd.com (987) 670-2050 Introduction Automated Optical Inspection (AOI) uses lighting, cameras, and vision computers
More informationPoint Spread Function Estimation Tool, Alpha Version. A Plugin for ImageJ
Tutorial Point Spread Function Estimation Tool, Alpha Version A Plugin for ImageJ Benedikt Baumgartner Jo Helmuth jo.helmuth@inf.ethz.ch MOSAIC Lab, ETH Zurich www.mosaic.ethz.ch This tutorial explains
More informationINFRARED IMAGING-PASSIVE THERMAL COMPENSATION VIA A SIMPLE PHASE MASK
Romanian Reports in Physics, Vol. 65, No. 3, P. 700 710, 2013 Dedicated to Professor Valentin I. Vlad s 70 th Anniversary INFRARED IMAGING-PASSIVE THERMAL COMPENSATION VIA A SIMPLE PHASE MASK SHAY ELMALEM
More informationStudy guide for Graduate Computer Vision
Study guide for Graduate Computer Vision Erik G. Learned-Miller Department of Computer Science University of Massachusetts, Amherst Amherst, MA 01003 November 23, 2011 Abstract 1 1. Know Bayes rule. What
More informationTutorial I Image Formation
Tutorial I Image Formation Christopher Tsai January 8, 28 Problem # Viewing Geometry function DPI = space2dpi (dotspacing, viewingdistance) DPI = SPACE2DPI (DOTSPACING, VIEWINGDISTANCE) Computes dots-per-inch
More informationThermography. White Paper: Understanding Infrared Camera Thermal Image Quality
Electrophysics Resource Center: White Paper: Understanding Infrared Camera 373E Route 46, Fairfield, NJ 07004 Phone: 973-882-0211 Fax: 973-882-0997 www.electrophysics.com Understanding Infared Camera Electrophysics
More informationEC-433 Digital Image Processing
EC-433 Digital Image Processing Lecture 2 Digital Image Fundamentals Dr. Arslan Shaukat 1 Fundamental Steps in DIP Image Acquisition An image is captured by a sensor (such as a monochrome or color TV camera)
More informationISO INTERNATIONAL STANDARD. Photography Electronic still-picture cameras Resolution measurements
INTERNATIONAL STANDARD ISO 12233 First edition 2000-09-01 Photography Electronic still-picture cameras Resolution measurements Photographie Appareils de prises de vue électroniques Mesurages de la résolution
More informationROBOT VISION. Dr.M.Madhavi, MED, MVSREC
ROBOT VISION Dr.M.Madhavi, MED, MVSREC Robotic vision may be defined as the process of acquiring and extracting information from images of 3-D world. Robotic vision is primarily targeted at manipulation
More information(Quantitative Imaging for) Colocalisation Analysis
(Quantitative Imaging for) Colocalisation Analysis or Why Colour Merge / Overlay Images are EVIL! Special course for DIGS-BB PhD program What is an Image anyway..? An image is a representation of reality
More informationVery short introduction to light microscopy and digital imaging
Very short introduction to light microscopy and digital imaging Hernan G. Garcia August 1, 2005 1 Light Microscopy Basics In this section we will briefly describe the basic principles of operation and
More informationKatarina Logg, Kristofer Bodvard, Mikael Käll. Dept. of Applied Physics. 12 September Optical Microscopy. Supervisor s signature:...
Katarina Logg, Kristofer Bodvard, Mikael Käll Dept. of Applied Physics 12 September 2007 O1 Optical Microscopy Name:.. Date:... Supervisor s signature:... Introduction Over the past decades, the number
More informationUSE OF HISTOGRAM EQUALIZATION IN IMAGE PROCESSING FOR IMAGE ENHANCEMENT
USE OF HISTOGRAM EQUALIZATION IN IMAGE PROCESSING FOR IMAGE ENHANCEMENT Sapana S. Bagade M.E,Computer Engineering, Sipna s C.O.E.T,Amravati, Amravati,India sapana.bagade@gmail.com Vijaya K. Shandilya Assistant
More informationImage Capture and Problems
Image Capture and Problems A reasonable capture IVR Vision: Flat Part Recognition Fisher lecture 4 slide 1 Image Capture: Focus problems Focus set to one distance. Nearby distances in focus (depth of focus).
More informationThe predicted performance of the ACS coronagraph
Instrument Science Report ACS 2000-04 The predicted performance of the ACS coronagraph John Krist March 30, 2000 ABSTRACT The Aberrated Beam Coronagraph (ABC) on the Advanced Camera for Surveys (ACS) has
More informationOptiSpheric IOL. Integrated Optical Testing of Intraocular Lenses
OptiSpheric IOL Integrated Optical Testing of Intraocular Lenses OPTICAL TEST STATION OptiSpheric IOL ISO 11979 Intraocular Lens Testing OptiSpheric IOL PRO with in air tray on optional instrument table
More informationNature Neuroscience: doi: /nn Supplementary Figure 1. Optimized Bessel foci for in vivo volume imaging.
Supplementary Figure 1 Optimized Bessel foci for in vivo volume imaging. (a) Images taken by scanning Bessel foci of various NAs, lateral and axial FWHMs: (Left panels) in vivo volume images of YFP + neurites
More informationLane Detection in Automotive
Lane Detection in Automotive Contents Introduction... 2 Image Processing... 2 Reading an image... 3 RGB to Gray... 3 Mean and Gaussian filtering... 5 Defining our Region of Interest... 6 BirdsEyeView Transformation...
More informationABOUT RESOLUTION. pco.knowledge base
The resolution of an image sensor describes the total number of pixel which can be used to detect an image. From the standpoint of the image sensor it is sufficient to count the number and describe it
More informationChapter 12 Image Processing
Chapter 12 Image Processing The distance sensor on your self-driving car detects an object 100 m in front of your car. Are you following the car in front of you at a safe distance or has a pedestrian jumped
More informationTSBB09 Image Sensors 2018-HT2. Image Formation Part 1
TSBB09 Image Sensors 2018-HT2 Image Formation Part 1 Basic physics Electromagnetic radiation consists of electromagnetic waves With energy That propagate through space The waves consist of transversal
More informationSome of the important topics needed to be addressed in a successful lens design project (R.R. Shannon: The Art and Science of Optical Design)
Lens design Some of the important topics needed to be addressed in a successful lens design project (R.R. Shannon: The Art and Science of Optical Design) Focal length (f) Field angle or field size F/number
More information1.6 Beam Wander vs. Image Jitter
8 Chapter 1 1.6 Beam Wander vs. Image Jitter It is common at this point to look at beam wander and image jitter and ask what differentiates them. Consider a cooperative optical communication system that
More informationHow to Optimize the Sharpness of Your Photographic Prints: Part I - Your Eye and its Ability to Resolve Fine Detail
How to Optimize the Sharpness of Your Photographic Prints: Part I - Your Eye and its Ability to Resolve Fine Detail Robert B.Hallock hallock@physics.umass.edu Draft revised April 11, 2006 finalpaper1.doc
More informationThe Noise about Noise
The Noise about Noise I have found that few topics in astrophotography cause as much confusion as noise and proper exposure. In this column I will attempt to present some of the theory that goes into determining
More informationPractical Flatness Tech Note
Practical Flatness Tech Note Understanding Laser Dichroic Performance BrightLine laser dichroic beamsplitters set a new standard for super-resolution microscopy with λ/10 flatness per inch, P-V. We ll
More informationdigital film technology Resolution Matters what's in a pattern white paper standing the test of time
digital film technology Resolution Matters what's in a pattern white paper standing the test of time standing the test of time An introduction >>> Film archives are of great historical importance as they
More informationEvaluating Commercial Scanners for Astronomical Images. The underlying technology of the scanners: Pixel sizes:
Evaluating Commercial Scanners for Astronomical Images Robert J. Simcoe Associate Harvard College Observatory rjsimcoe@cfa.harvard.edu Introduction: Many organizations have expressed interest in using
More informationQUANTITATIVE IMAGE TREATMENT FOR PDI-TYPE QUALIFICATION OF VT INSPECTIONS
QUANTITATIVE IMAGE TREATMENT FOR PDI-TYPE QUALIFICATION OF VT INSPECTIONS Matthieu TAGLIONE, Yannick CAULIER AREVA NDE-Solutions France, Intercontrôle Televisual inspections (VT) lie within a technological
More informationBIG PIXELS VS. SMALL PIXELS THE OPTICAL BOTTLENECK. Gregory Hollows Edmund Optics
BIG PIXELS VS. SMALL PIXELS THE OPTICAL BOTTLENECK Gregory Hollows Edmund Optics 1 IT ALL STARTS WITH THE SENSOR We have to begin with sensor technology to understand the road map Resolution will continue
More information30 lesions. 30 lesions. false positive fraction
Solutions to the exercises. 1.1 In a patient study for a new test for multiple sclerosis (MS), thirty-two of the one hundred patients studied actually have MS. For the data given below, complete the two-by-two
More informationLane Detection in Automotive
Lane Detection in Automotive Contents Introduction... 2 Image Processing... 2 Reading an image... 3 RGB to Gray... 3 Mean and Gaussian filtering... 6 Defining our Region of Interest... 10 BirdsEyeView
More informationThe IQ3 100MP Trichromatic. The science of color
The IQ3 100MP Trichromatic The science of color Our color philosophy Phase One s approach Phase One s knowledge of sensors comes from what we ve learned by supporting more than 400 different types of camera
More informationImage Enhancement in Spatial Domain
Image Enhancement in Spatial Domain 2 Image enhancement is a process, rather a preprocessing step, through which an original image is made suitable for a specific application. The application scenarios
More informationSampling and pixels. CS 178, Spring Marc Levoy Computer Science Department Stanford University. Begun 4/23, finished 4/25.
Sampling and pixels CS 178, Spring 2013 Begun 4/23, finished 4/25. Marc Levoy Computer Science Department Stanford University Why study sampling theory? Why do I sometimes get moiré artifacts in my images?
More informationAN IMPROVED NO-REFERENCE SHARPNESS METRIC BASED ON THE PROBABILITY OF BLUR DETECTION. Niranjan D. Narvekar and Lina J. Karam
AN IMPROVED NO-REFERENCE SHARPNESS METRIC BASED ON THE PROBABILITY OF BLUR DETECTION Niranjan D. Narvekar and Lina J. Karam School of Electrical, Computer, and Energy Engineering Arizona State University,
More informationComputing for Engineers in Python
Computing for Engineers in Python Lecture 10: Signal (Image) Processing Autumn 2011-12 Some slides incorporated from Benny Chor s course 1 Lecture 9: Highlights Sorting, searching and time complexity Preprocessing
More informationImage Perception & 2D Images
Image Perception & 2D Images Vision is a matter of perception. Perception is a matter of vision. ES Overview Introduction to ES 2D Graphics in Entertainment Systems Sound, Speech & Music 3D Graphics in
More informationBackground. Computer Vision & Digital Image Processing. Improved Bartlane transmitted image. Example Bartlane transmitted image
Background Computer Vision & Digital Image Processing Introduction to Digital Image Processing Interest comes from two primary backgrounds Improvement of pictorial information for human perception How
More informationChapter 3. Study and Analysis of Different Noise Reduction Filters
Chapter 3 Study and Analysis of Different Noise Reduction Filters Noise is considered to be any measurement that is not part of the phenomena of interest. Departure of ideal signal is generally referred
More informationImage Enhancement in spatial domain. Digital Image Processing GW Chapter 3 from Section (pag 110) Part 2: Filtering in spatial domain
Image Enhancement in spatial domain Digital Image Processing GW Chapter 3 from Section 3.4.1 (pag 110) Part 2: Filtering in spatial domain Mask mode radiography Image subtraction in medical imaging 2 Range
More informationVision. The eye. Image formation. Eye defects & corrective lenses. Visual acuity. Colour vision. Lecture 3.5
Lecture 3.5 Vision The eye Image formation Eye defects & corrective lenses Visual acuity Colour vision Vision http://www.wired.com/wiredscience/2009/04/schizoillusion/ Perception of light--- eye-brain
More informationNikon Instruments Europe
Nikon Instruments Europe Recommendations for N-SIM sample preparation and image reconstruction Dear customer, We hope you find the following guidelines useful in order to get the best performance out of
More information