Imaging Particle Analysis: The Importance of Image Quality

Lew Brown, Technical Director, Fluid Imaging Technologies, Inc.

Abstract: Imaging particle analysis systems can derive far more information about particles than classical volumetric systems (light obscuration, electrozone sensing, laser diffraction, etc.), which can only report particle size as the Equivalent Spherical Diameter (ESD) of a sphere corresponding to the derived volume of the particle. Imaging-based systems measure many different parameters for each particle, capturing both shape and gray-scale information. This gives them the ability to differentiate among non-similar particles in a heterogeneous mixture: image-based systems can classify particles into different types, whereas volumetric systems can only report a size for each particle. One drawback of imaging particle analysis systems is that they are limited at the lower end of the size range they can properly characterize (1). However, for applications within their resolution range, they offer significant advantages. This paper discusses how the quality of the images directly affects the ability of the system to make proper measurements, and therefore its ability to differentiate between different particle types. Using real-world data and examples, we arrive at the succinct (but rather unscientific) conclusion: Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications.

I. What is Image Quality?

Everyone has a basic understanding of the term "image quality" that is purely subjective, or qualitative, in nature. We make qualitative judgments on image quality all the time, especially about our own pictures taken with consumer cameras. Figure 1 shows an example of a simple landscape image where one version is clearly of higher quality than the other. The lower image is perceived as poorer quality due to its lack of sharpness and improper color balance.

Figure 1: Landscape image with good image quality (top) and bad image quality (bottom).

It is crucially important in any discussion of image quality to recognize that many of the contributing factors can only be evaluated at the system level. For example, the quality of an image produced by a digital camera is the sum of several parts: the lens's ability to correctly transmit the information, the sensor's resolution, and the noise and processing characteristics of the camera electronics. Furthermore, if the image produced by this camera is being judged for quality by a human being, then the characteristics of the observer also come into play: the quality of the person's eyesight (do they need corrective lenses, and if so, are they wearing them?), the distance at which the image is viewed, the lighting conditions, and so on. Last, but not least, in the case of a human observer the situation becomes even muddier because each person's brain processes visual information differently.

Some of the common descriptors used when discussing image quality include sharpness (amount of detail conveyed), dynamic range (range of light levels captured), contrast, distortion (usually caused by the optics), tonal mapping, and artifacts (usually produced by the electronics or software processing of the image). Of all of these, the factor that has the largest influence on perceived image quality is usually sharpness. Although this is sometimes simplified to "focus," sharpness is really a combination of many factors (focus being only one) that together indicate the overall amount of spatial information captured by the image.

Due to the high degree of variability in subjective image quality, scientists have found it necessary to develop quantitative measurements of image quality. These measurements are usually made at the system level, as discussed above, except in cases where a particular element of an optical system can be isolated, such as the lens alone.

II. Quantitative Measures of Image Quality

As stated above, the most critical factor in image quality is sharpness: the ability of the system to capture image detail. If an image is blurry or out of focus to begin with, other factors such as tonal mapping, color balance and dynamic range carry little meaning, because the underlying spatial information is simply not present in the image. Sharpness is most commonly equated with the term "resolution," which can be a bit of a misnomer and a cause for confusion, since there are many different kinds of resolution (sensor, spectral, display, etc.). In general, however, the resolution of a system is defined as its ability to reproduce spatial detail in the object being imaged (2).

In systems that include the human eye, resolution is a combination of sharpness and acutance. Acutance is perceived sharpness caused by the changes in brightness at edge transitions (3). Those familiar with Photoshop use operations such as unsharp masking and sharpening frequently; these operations increase perceived sharpness through acutance changes only, while the actual resolution of the image is unchanged. As an example, the lower image in Figure 1 was produced merely by applying a blurring filter and lowering the image acutance, but it still contains the same resolution (in pixels) as the original image. Since imaging particle analysis involves measurements made on images by a computer, we need not worry about acutance in this discussion, as it is a perceived-only (via the human eye) characteristic. However, it is important to remember that when any of these images are viewed by the human eye, the subjectivity of vision and acutance can cause confusion.

Quantitative measurement of image sharpness is typically expressed as the ability of the system to distinguish closely spaced, high-contrast pairs of lines within a given distance, most commonly in line pairs per millimeter (lp/mm) (4). To measure this, the system under evaluation images a standardized test target consisting of groups of bars at increasing numbers of line pairs per millimeter; the smallest bars that the imaging system can still discern define the limit of the system's resolving power. One of the most commonly used targets is the 1951 USAF Resolution Test Target (Figure 2).

Figure 2: USAF 1951 Resolution Test Target (5).

The problem with using lp/mm alone to measure sharpness is that a subjective judgment remains in the word "discern" in the definition above. To overcome this limitation, a measurement called the Modulation Transfer Function (MTF) was introduced, which measures the ability of the system to transfer contrast in the spatial frequency domain. The modulation of contrast transmitted by the system from target to image, ranging from 100% (unattenuated) to 0% (completely attenuated), is measured over a range of spatial frequencies and plotted (5). A typical MTF graph is shown in Figure 3. The modulation of contrast is measured by using the system to image a target such as the USAF target above, or more commonly a sinusoidal (in intensity) target of varying frequency, and comparing the transmitted contrast to the original contrast of the target. The commonly accepted modulation representing the limit of a system's ability to transfer information is 9%, which is also known as the Rayleigh diffraction limit (5).

Figure 3: Typical MTF plot (6).
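To make the modulation idea concrete, the following is a minimal Python sketch (not taken from any of the cited tools) that approximates an MTF curve: it "images" sinusoidal targets of increasing spatial frequency through a Gaussian blur standing in for the real system's point spread function, and reports the ratio of transmitted to original modulation, (Imax - Imin)/(Imax + Imin). The pixel size, blur width and function names are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def modulation(profile):
    """Contrast modulation of an intensity profile: (Imax - Imin) / (Imax + Imin)."""
    i_max, i_min = profile.max(), profile.min()
    return (i_max - i_min) / (i_max + i_min)

def mtf_curve(frequencies_lp_per_mm, pixel_size_mm=0.001, blur_sigma_px=2.0):
    """Approximate an MTF curve by imaging sinusoidal targets through a Gaussian blur
    (a stand-in for the real system's point spread function)."""
    x_mm = np.arange(0, 2.0, pixel_size_mm)                  # 2 mm wide synthetic target
    mtf = []
    for f in frequencies_lp_per_mm:
        target = 0.5 + 0.5 * np.sin(2 * np.pi * f * x_mm)    # 100% modulation target
        image = gaussian_filter1d(target, blur_sigma_px)     # "imaged" profile
        mtf.append(modulation(image) / modulation(target))
    return np.array(mtf)

freqs = np.linspace(1, 200, 50)
print(np.round(mtf_curve(freqs), 3))   # modulation falls toward 0 as frequency rises
```

Plotting the returned values against frequency gives a curve of the same general shape as Figure 3: near 100% transfer at low spatial frequencies, falling toward zero at high frequencies.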

A final common, related descriptor of image sharpness is the Point Spread Function (PSF), which can be thought of as the spatial-domain equivalent of the MTF (7). The PSF is the intensity response of the system when it images a point light source at infinity (i.e., perfectly focused). If the system were perfect, the image would also be a point. However, diffraction and other factors cause the image of the point to be spread out, and in fact diffraction causes intensity rings to emanate from the center. This representation is also referred to as the Airy disk. A 3-D plot and a 2-D intensity image of the Airy disk are shown in Figure 4 (8).

Figure 4: Airy disk representation of the Point Spread Function (PSF), shown as 3-D and 2-D intensity plots (8).

III. Variation in Image Measurements Caused by Varying Image Quality

In imaging particle analysis, the measurements made on the particles are performed on a thresholded binary image rather than on the original gray-scale image (9). This is done by choosing a gray-scale threshold value and then declaring each pixel "particle" or "not particle" based upon a comparison of the pixel's gray-scale value against the threshold. Figure 5 shows two particle images of a 10µm calibration bead, one in focus and the other out of focus.

Figure 5: Particle images of 10µm calibrated beads, "sharp" focus (left) and "blurry" focus (right). (NOTE: both images contain extra blur caused by the pixel-replicated zoom needed to display them for print; the original 72 ppi screen images were upsampled to a print resolution of 300 ppi.)

An intensity profile can be generated for each bead by drawing a line across the particle (passing through its center) and plotting distance versus intensity. Figure 6 shows the result of doing this for the two particles in Figure 5.

Figure 6: Intensity profile plots for the two images in Figure 5. The upper plot is for the "sharp" image, the lower plot for the "blurred" image.

As can be seen in Table 1, based on the information from Figure 6, if both images are thresholded identically at an intensity value of 150, the sharp particle has a diameter of 10.13µm, whereas the blurry particle has a diameter of 10.96µm. Table 1 shows how the diameter of each particle varies with the threshold value. Note that the ESD of the blurry image varies by a total of 12.86µm between threshold values of 100 and 200, whereas the ESD of the sharp image varies by only 1.67µm over the same range.

Threshold Value    ESD, "Sharp" Image (µm)    ESD, "Blurry" Image (µm)
100                 9.33                       4.69
125                 9.72                       7.80
150                10.13                      10.96
175                10.49                      14.64
200                11.00                      17.55

Table 1: ESD values for varying threshold levels on the two particles shown in Figure 5.
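The following is a minimal sketch of the thresholding-to-ESD chain just described, written in Python for illustration only (it is not the VisualSpreadsheet implementation, which also handles background correction, blob labeling and hole filling). The synthetic bead, the 0.5 µm/pixel calibration and the function names are assumptions; the point is to show how a blurred edge makes the derived diameter swing with the chosen threshold, mirroring Table 1.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def particle_esd(gray_image, threshold, microns_per_pixel):
    """Threshold a gray-scale particle image (dark particle on a bright background)
    and return the Equivalent Spherical Diameter (ESD, in microns) of the blob."""
    binary = gray_image < threshold                   # "particle" = darker than threshold
    area_um2 = binary.sum() * microns_per_pixel**2    # pixel count -> area in square microns
    return 2.0 * np.sqrt(area_um2 / np.pi)            # diameter of the equal-area circle

# Synthetic 10 µm bead (dark disk on a bright background) at 0.5 µm/pixel,
# rendered once sharp and once defocused by a Gaussian blur.
yy, xx = np.mgrid[0:64, 0:64]
disk = np.hypot(xx - 32, yy - 32) <= 10.0             # radius 10 px = 5 µm
sharp = np.where(disk, 50, 230).astype(float)
blurry = gaussian_filter(sharp, sigma=4.0)

for t in (100, 150, 200):
    print(t, round(particle_esd(sharp, t, 0.5), 2), round(particle_esd(blurry, t, 0.5), 2))
# The blurry bead's ESD changes far more with threshold than the sharp bead's.
```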

IV. Example of the Effect of Image Quality on Particle Measurements

To demonstrate concretely how varying image quality affects measurements in imaging particle analysis, a simple experiment was run. Fluid Imaging Technologies' FlowCAM was used to image NIST-traceable calibrated beads of 10µm diameter. The system was set up with the 10X objective lens, which yields an overall system magnification of 100X. For these parameters, the FlowCAM would normally be configured with a 100µm-deep flow cell in order to keep the particles in sharp focus. In this instance, however, a 300µm-deep flow cell was used to ensure that some of the imaged particles would be out of focus.

One of the many measurements made by the FlowCAM's VisualSpreadsheet software is edge gradient. Edge gradient measures the sharpness of edges in the image by looking at the slope of the gray-scale change across an edge. High edge gradients indicate sharp edges, while low edge gradients indicate fuzzy edges. For a high-contrast target such as the beads in this experiment, particles with a high edge gradient appear in sharp focus, whereas those with a low edge gradient appear blurry. The edge gradient measurement can therefore be used to quickly separate high-quality (sharp) bead images from low-quality (blurry) bead images.

After running the beads through the FlowCAM several times in this configuration, the diameter-versus-frequency graphs clearly showed a bimodal size distribution: one peak at 10µm representing the beads in sharp focus, and another, wider peak centered around 16µm representing the out-of-focus beads. Using the VisualSpreadsheet software, one can quickly isolate the best-quality images from each peak by selecting the peak and then sorting on edge gradient. Figure 5 shows the results, and clearly demonstrates the measurement change that occurs when the beads are not in sharp focus. The mean Equivalent Spherical Diameter (ESD) for the in-focus beads was 9.81µm, whereas the mean ESD for the out-of-focus beads was 15.64µm.

Figure 5: Comparison of bead images and corresponding summary statistics from the lower peak (left) and upper peak (right).
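As an illustration of the general idea behind an edge-gradient measurement (not the VisualSpreadsheet algorithm itself), the sketch below scores a particle by the mean gray-scale gradient magnitude along its boundary pixels: sharply focused particles produce steep edges and high scores, blurred particles produce shallow edges and low scores. The function name and the exact boundary definition are assumptions.

```python
import numpy as np
from scipy import ndimage

def edge_gradient(gray_image, binary_mask):
    """Mean gray-scale gradient magnitude over the particle's boundary pixels.
    Higher values indicate sharper (better focused) edges; an illustrative metric,
    not the VisualSpreadsheet implementation."""
    gy, gx = np.gradient(gray_image.astype(float))                    # per-pixel intensity slopes
    magnitude = np.hypot(gx, gy)
    boundary = binary_mask & ~ndimage.binary_erosion(binary_mask)     # outline of the blob
    return magnitude[boundary].mean()

# Sorting particle images by this score (descending) places the sharply focused beads
# first, which is how the sharp and blurry populations were separated above.
```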

Figure 6: Close-up view of two bead image sets (left) and the same sets with binary overlay (right). Note the higher degree of variability, in both size and shape, of the out-of-focus beads (right-hand side of each screen shot).

One might argue that this result is somewhat "rigged," since the FlowCAM's distance calibration was based upon sharply focused beads. However, one only needs to look at the standard deviation and coefficient of variation (CV) of the two data sets to see that the mean is only part of the story. The out-of-focus particles exhibit a significantly higher standard deviation (0.56µm versus 0.14µm) and CV (3.57% versus 1.46%) than the in-focus particles. To put this in rather succinct (but unscientific) terms: Fuzzy Images = Fuzzy Measurements.

V. How Particle Measurements Affect Particle Classification

As shown in the example above, image quality (here, specifically sharpness) has a direct relationship to the quality and precision of the measurements made on particles from their images. The efficacy of any type of particle classification, whether value-based or statistically based (9), depends on the quality of those measurements, and therefore depends directly upon the original image quality produced by the system. In either type of classification, the measurements form the basis for characterizing the particle images. Taking the above example to its next logical step, we can expect image quality to directly affect the ability of pattern recognition software to properly classify particles.

To demonstrate this, another experiment was run using the FlowCAM. This time, rather than beads, a real-world sample was used containing non-spherical particles that vary slightly in shape, size, transparency and other attributes: a culture of the phytoplankton species Cosmarium. This species has a fairly distinctive shape, as can be seen from the sample images in Figure 7. The average size of these particles is around 50µm, so they are normally run in the FlowCAM using the 4X objective (approximately 40X overall magnification) with a 300µm-deep flow cell. As before, in order to capture particles both in and out of focus, a deeper flow cell than normal (600µm) was used. After the sample had been run, two image libraries were built by selecting 8 Cosmarium images that represented typical in-focus images (Figure 7) and 8 that represented typical blurry images (Figure 8).

Figure 7: Typical in-focus images of Cosmarium algae as captured by the FlowCAM.

Figure 8: Typical out-of-focus images of Cosmarium algae as captured by the FlowCAM.

These library images were then used as the basis for a statistical pattern recognition algorithm (9), whereby each particle in the run is compared statistically against the library particles to determine how closely it matches the target particle type.

Figure 9 shows the overall results for one of the FlowCAM runs made with the Cosmarium culture. The left-side window shows the summary statistics for the run: 4,500 particle images were collected and stored, yielding a mean volume-weighted ESD of 52.91µm and a mean aspect ratio (width/length) of 0.69. The right-side window shows a random sampling of the particle images themselves; the variations in shape and size are clearly visible, caused both by the organic variation of the species itself and by the variation in focus through the flow cell depth.

Figure 9: Summary results of a FlowCAM run using the Cosmarium culture.

Figure 10: Typical images and summary statistics for particles found with the "sharp" image library.

Figure 11: Typical images and summary statistics for particles found with the "blurry" image library.
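The sketch below illustrates the general shape of such a library-based statistical filter: summarize a reference library by per-feature mean and spread, then accept unknown particles whose feature vectors fall inside that cluster. The feature list, the acceptance rule and the function names are all assumptions made for illustration; the actual algorithm used by VisualSpreadsheet is described in reference (9).

```python
import numpy as np

def build_library(feature_vectors):
    """Summarize library particles (rows = particles, columns = features such as
    ESD, aspect ratio, edge gradient, mean intensity) by per-feature mean and
    standard deviation."""
    lib = np.asarray(feature_vectors, dtype=float)
    return lib.mean(axis=0), lib.std(axis=0, ddof=1)

def matches_library(features, lib_mean, lib_std, k=3.0):
    """Accept a particle if every feature lies within k standard deviations of the
    library mean. A hypothetical rule, not VisualSpreadsheet's: a tight (sharp)
    library gives small std values and a selective filter, while a loose (blurry)
    library gives large std values and an ambiguous one."""
    z = np.abs(np.asarray(features, dtype=float) - lib_mean) / lib_std
    return bool(np.all(z <= k))

# Usage sketch: sharp_library and run_particles would be arrays of measured features.
# sharp_mean, sharp_std = build_library(sharp_library)
# hits = [p for p in run_particles if matches_library(p, sharp_mean, sharp_std)]
```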

At this point, the two statistical filters created from the libraries shown in Figures 7 and 8 were run against the entire data set (separately, not at the same time). Figure 10 shows some of the particle images found using the "sharp" library, along with the summary statistics for the filter. Figure 11 shows the same results for the "blurry" library. Out of the 4,500 original particles, the sharp filter found 230 like particles, representing a concentration of 1,029 particles/ml, whereas the blurry filter found only 44 like particles, representing a concentration of only 197 particles/ml. The sharp filter therefore found more than five times as many matching particles as the blurry filter. This is because the sharp library has tighter measurements, forming a tighter cluster in the n-dimensional pattern recognition space (9) and thereby more specifically defining the shape and gray-scale attributes of the desired particles. The blurry library forms a much looser, more ambiguous cluster due to the larger variance in all of its measurements.

This particular case shows a very simple situation where only a single particle type is present in the sample. In the real world, such as when studying a water sample with multiple algal species present, or a protein sample containing agglomerated proteins, silicone droplets and other foreign matter, the importance of sharp library images becomes even greater. In such applications the filters are run at the same time and must be mutually exclusive (a particle should not belong to two different types). Blurry library images will not only cause undercounting, as in the example just shown, but will also lead to false positives and false negatives in each class, making the classification less accurate and less repeatable. To further expound on the succinct (but unscientific) conclusion of Section IV: Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications.

VI. Conclusions

While image quality can be a rather fleeting concept in the realm of the qualitative (subjective), digital image processing does not suffer from any ambiguity in this arena: digital images are purely mathematical constructs, and can therefore be evaluated in mathematical, quantitative terms. Since imaging particle analysis often deals with microscopic particles, it frequently runs into limitations based purely on resolution (1). Even when the particles are within the range that a particular imaging particle analysis system can resolve well, image quality still greatly affects the ability of the system to classify particles of different types (let alone properly measure them). Through two real-world examples, this paper has shown that image quality, in particular sharpness, greatly affects measurement quality and therefore the system's ability to properly classify particles of different types. Since the true strength of imaging particle analysis systems over volumetric systems (which can only measure particle size) is their ability to differentiate particles based upon shape and gray scale, image quality is paramount to the value of such a system. As previously summarized: Fuzzy Images = Fuzzy Measurements = Fuzzy Classifications.

VII. References

1.) "Imaging Particle Analysis: Resolution and Sampling Considerations", Lew Brown, Fluid Imaging Technologies web site: http://fluidimaging.com/imaging-particle-analysis-white-papers.aspx
2.) Wikipedia entry on Optical Resolution: http://en.wikipedia.org/wiki/Optical_resolution
3.) Wikipedia entry on Acutance: http://en.wikipedia.org/wiki/Acutance
4.) "Understanding Sharpness", Michael Reichmann, Luminous Landscape web site: http://www.luminouslandscape.com/tutorials/sharpness.shtml
5.) "Understanding image sharpness part 1: Introduction to resolution and MTF curves", Norman Koren, tutorial on web site: http://www.normankoren.com/tutorials/mtf.html
6.) MTF plot from: http://www.edmundoptics.com/technical-support/optics/modulation-transfer-function/
7.) Wikipedia entry on Point Spread Function: http://en.wikipedia.org/wiki/Point_spread_function
8.) Wikipedia entry on Airy Disk: http://en.wikipedia.org/wiki/Airy_disk
9.) "Particle Image Understanding - A Primer", Lew Brown, Fluid Imaging Technologies web site: http://fluidimaging.com/imaging-particle-analysis-white-papers.aspx