Flat field correction for high-throughput imaging of fluorescent samples

Journal of Microscopy, Vol. 263, Issue , pp. Received 16 March 2015; accepted 16 February 2016. doi: /jmi

Flat field correction for high-throughput imaging of fluorescent samples

PEET KASK, KAUPO PALO, CHRIS HINNAH & THORA POMMERENCKE
PerkinElmer Cellular Technologies Germany GmbH, Hamburg, Germany

Correspondence to: Peet Kask, PerkinElmer Cellular Technologies Germany GmbH Eesti filiaal, Vabaõhumuuseumi tee 2a-29, Tallinn, Estonia. Tel: ; e-mail: peet.kask@perkinelmer.com

Key words: flat field, fluorescence microscopy, intensity variations, shading correction, vignetting.

Summary

Vignetting of microscopic images impacts both the visual impression of the images and any image analysis applied to them. High-throughput screening in particular places high demands on automated image analysis. In our work we focused on fluorescent samples and found that two profiles (background and foreground) need to be estimated for each imaging channel to achieve a sufficiently flat image after correction. We have developed a method which runs completely unsupervised on a wide range of assays. By adding a reliable internal quality control we mitigate the risk of introducing artefacts into sample images through correction. The method requires hundreds of images for the foreground profile, thus limiting its application to high-throughput screening, where this requirement is fulfilled in routine operation.

Introduction

Images acquired by a microscope are usually affected by shading (or vignetting): the measured intensity of an object depends on the coordinates of the object in the field of view, and pixels in the corners of the image tend to be darker than those in the centre. A similar phenomenon is also known in general photography (e.g. Varga et al., 2004; Goldman & Chen, 2005) and in astronomy (e.g. Xue & Shi, 2008). However, the detailed characteristics of shading depend critically on the specific optical path, and thus rules and conclusions cannot easily be transferred between the different fields. In this publication we present a fully automatic shading correction applicable to fluorescent microscopy images of cellular samples acquired in a high-throughput mode.

In microscopy images the shading usually leads to a roughly centred intensity drop from centre to corner. A number of different physical mechanisms contribute to the overall effect: nonuniform intensity of the incident light, nonuniform angular sensitivity of the detectors, radial fall-off of the light collection efficiency of lenses (Goldman & Chen, 2005), and nonidealities in microscope optical details and adjustment (Tomaževič et al., 2002; Piccinini et al., 2012). Correction of shading may be motivated simply by the impaired appearance of the images: the intensity variations become most obvious when stitching different fields of view of a sample (e.g. Gareau et al., 2009; Bevilacqua et al., 2011). However, the impact of shading on later stages of data processing, for example segmentation and especially quantification, is also severe. Thus many software packages include an image processing procedure to compensate for the coordinate-dependent intensity variations.

Many approaches to shading correction have been developed and described. They have in common that at least one function of the image coordinates is involved in the model of shading. This function may be called the shading function, vignetting function or intensity profile. A prerequisite of a successful correction is that the shading function (or functions) is adequately estimated.
One basis for classifying the different flat field correction approaches is the requirement for calibration samples. Some approaches use calibration samples when estimating the shading functions (Salmon et al., 1998; Model & Burkhardt, 2001), whereas others derive the functions directly from the sample images (Likar et al., 2000; Leong et al., 2003; Bevilacqua et al., 2011; Lee et al., 2014; Smith et al., 2015). In the first case the approach is called prospective shading correction, whereas in the latter case it is a retrospective one. Retrospective approaches are attractive as they do not require the cumbersome step of acquiring calibration images of special samples; rather, the correction functions are estimated directly from the images of interest, thus ensuring identical optical settings for the shading functions and the experiment.

Another basis for classifying shading correction methods is the model used to represent the shading function. One can distinguish between methods using a parametric model of the shading function (Likar et al., 2000; Young, 2000) and those which are parameter-free (Model & Burkhardt, 2001; Leong et al., 2003; Bevilacqua et al., 2011; Gherardi et al., 2011; Lee et al., 2014; Smith et al., 2015).

Still another way to classify flat field correction methods is by the number of shading functions used. Simpler correction algorithms work with a single shading function (Leong et al., 2003; Jones et al., 2006; Bevilacqua et al., 2011; Gherardi et al., 2011; Piccinini et al., 2012; Lee et al., 2014), whereas more sophisticated approaches require the estimation of two functions, an additive and a multiplicative one (Likar et al., 2000; Young, 2000; Tomaževič et al., 2002; Smith et al., 2015). In fact, the approaches with a single shading function can be considered special cases of the approach with two shading functions in which one of the two is ignored (Tomaževič et al., 2002) and either an additive or a multiplicative correction is applied. According to the model involving both additive and multiplicative modulating functions, image formation can be described by the following expression (e.g. Likar et al., 2000):

N(x, y) = U(x, y) S_M(x, y) + S_A(x, y).   (1)

Here N(x, y) is the measured image, U(x, y) is the true image, S_M(x, y) is the multiplicative shading component (usually normalized) describing the coordinate-dependent modulation of the true image by the microscope, and S_A(x, y) is the additive (background) shading component, which is also a function of the image coordinates but generally of a different shape than the multiplicative profile. Correction against shading is, in other words, the restoration of the true image based on estimates of the two shading components and can be expressed as

Û(x, y) = (N(x, y) - Ŝ_A(x, y)) / Ŝ_M(x, y).   (2)

Different hypotheses exist with regard to the origin of the additive shading component. In a number of publications the additive shading component has been called the black image and considered to be adequately measurable without any illumination of the sample (Young, 2000; Varga et al., 2004). However, in fluorescence microscopy of cells another interpretation seems more adequate: the additive profile is considered to characterize the background, the dark signal being only one component contributing to it (Schwarzfischer et al., 2011), whereas the multiplicative profile characterizes the foreground, that is, the actual objects in the image. The background image can be measured directly using empty images of the culture medium (Piccinini et al., 2013), but it is also visible in the sample images in regions with no foreground objects (Lee et al., 2014). The background profile is a normalized background image (usually with mean intensity 1.0) characterizing its shape. If the optical adjustment of the microscope is reasonable then the background profile is above average in the centre of the image and drops smoothly with distance from the image centre. The background profile is reproduced surprisingly accurately from image to image in different wells and different fields of view.

Measuring the foreground profile is much more difficult because the smooth illumination profile introduced by the optical system is superimposed by inter- and intraobject intensity variations. Objects differ in intensity due to their location and the amount of marker they carry. Furthermore, there is also intraobject variation of the intensity in different regions of the foreground objects. Shifting an object in the field of view (e.g. by moving the object holder under the microscope) is a straightforward way to illustrate the dependence of intensity on location.
The dependence of the measurable intensity of an object on the image coordinates is the foreground profile. In fact, the foreground profile can be estimated by acquiring many images while shifting the sample holder in small steps in both directions (Piccinini et al., 2013).

It is simplistic to assume that the background and foreground profiles are identical. The background intensity profile is partly (or even dominantly) formed by emission from regions which are far out of the focal plane, whereas the foreground profile characterizes the emission of in-focus objects. Our studies have shown that the estimated background and foreground profiles are always significantly different.

An elegant method of estimating both shading components (without specifying their physical meaning) is via parametric approximation of the two smoothly varying shading components and estimation of the parameters by entropy minimization (Likar et al., 2000). This approach belongs to the class of retrospective shading corrections, requiring neither special calibration samples nor calibration measurements. Recently, another retrospective shading correction method has been published which estimates two shading functions without assuming any parametric models (Smith et al., 2015). The smoothness of the profile functions is granted by regularization. A restriction of this approach is the assumption that objects may appear anywhere in the image with equal probability. We have discovered that in real-life applications the density of cells is often higher near well borders than in the centre.

The method introduced here has many similarities with those of Likar et al. (2000) and Smith et al. (2015): it belongs to the retrospective class (i.e. it does not use calibration images) and estimates two shading functions. But contrary to Likar et al. (2000) and Smith et al. (2015), a segmentation step is used, which makes it possible to accumulate data about the background and foreground profiles separately and then estimate the two profiles one after the other. This significantly improves the robustness of the data analysis. In summary, segmentation is a step of crucial importance towards enabling automated estimation of the background and foreground profiles without any user intervention. Segmentation has rarely been used in the estimation of shading profiles; one reason is that it is computationally expensive. Furthermore, it suffers from a chicken-and-egg problem, as it is difficult to design a robust and accurate segmentation before the correction profiles are known.

However, segmentation-free approaches usually involve more additional assumptions than segmentation approaches, such as a mathematical model of the profiles or an even distribution of objects in the field of view. By using the sample images themselves, no time-consuming calibration procedure is required; such calibration procedures commonly also lack the ability to approximate the foreground profile. Another characteristic of our approach is that we take advantage of the high number of sample images that are available in one high-content screening experiment. This abundance of sample images enables built-in quality control and outlier filtering, which is crucial for a fully automated procedure. Profile estimation is finished when the random error has decreased below a wanted level. If the required level cannot be reached, the respective correction (usually the more complex foreground correction) is not applied, thus assuring the high quality of the automated and unsupervised approach.

Materials and methods

In this section we describe the steps of a fully automated approach for flat field correction which involves estimating both background and foreground profiles from regular sample images. The basis for the procedure outlined below is the formula for applying the shading correction. We will thus start by describing how the estimated profiles are applied, that is, how they convert the measured images into flat field images.

Shading correction formulae

The basic shading correction formula has been presented already in the Introduction (Eq. (2)). When assuming that the estimated profiles, background and foreground, are normalized images with mean 1.0, Eq. (2) must be modified as follows:

Û(x, y) = (N(x, y) - B̂ Ŝ_b(x, y)) / Ŝ_f(x, y) + B̂,   (3)

where B̂ is the estimated mean background intensity of the image to be corrected, Ŝ_b(x, y) is the normalized background profile image and Ŝ_f(x, y) is the normalized foreground profile image. In this formula, the correction for the background profile is additive whereas the correction for the foreground profile is multiplicative. Compared to Eq. (2) there is a minor modification: the constant background term B̂ is added to the right-hand side. This usually grants that the corrected image is positive in all pixels, and it also restores the original background intensity level, which may be informative in some circumstances.

Steps of estimation of background and foreground profiles

In Figure 1, the data flow for estimating shading profiles is presented. Each image is segmented, that is, regions which securely belong to the background and regions which are representative examples of the foreground are detected. The segmentation steps are described in detail in the next two sections below. The segmentation step returns two images: a background and a foreground image. Pixels outside the detected regions are of undefined content (represented by NaNs) in these images. Next, both of the images are binned into much smaller images (of the order of 17 x 16 pixels, see below). The binned images are accumulated (kept in RAM) for further processing. Routinely, we discard background images with a detected background region of less than 15%. (This stage may in principle cause failures in estimating the background profile, but in reality such cases are very rare. Furthermore, there can be other reasons for failure in estimating one or the other profile; in such cases the built-in quality control prevents the user from applying profiles below the quality limit, thus preventing any deterioration of images.)
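For readers who prefer a concrete form of Eq. (3), a minimal sketch of the correction step is given below. It is our own illustration rather than the authors' code: the function name correct_flat_field is hypothetical, the two profiles are assumed to be already estimated, normalized to mean 1.0 and resampled to the image size, and the crude percentile-based estimate of the mean background intensity B̂ merely stands in for the estimate obtained from the segmented background regions.

import numpy as np

def correct_flat_field(image, bg_profile, fg_profile):
    """Apply the shading correction of Eq. (3).

    image       -- measured image N(x, y)
    bg_profile  -- normalized background profile S_b(x, y), mean 1.0
    fg_profile  -- normalized foreground profile S_f(x, y), mean 1.0
    """
    image = image.astype(np.float64)
    # Estimate the mean background intensity B of this image. A low
    # percentile is used here as a stand-in; the paper estimates B from
    # the segmented background regions of the image.
    b_hat = np.percentile(image, 20)
    # Additive correction for the background profile, multiplicative
    # correction for the foreground profile, plus restoration of the
    # original background level B.
    corrected = (image - b_hat * bg_profile) / fg_profile + b_hat
    return corrected
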
For estimation of the background profile, accumulation of sufficiently many background images is needed, enabling the following steps of processing (see the left column of the data flow chart). First, outlier images are detected and discarded. For that purpose, the set of images is globally fitted and each individual image is characterized by its mean square deviation from the global fit image. If one or more images are characterized by abnormally high deviations then these are discarded from the set. If any image has been discarded then global fitting and detection of outliers are repeated, until no more images are found to be outliers. Thereafter, the result of global fitting is considered the estimated profile. To determine the statistical error of the estimated profile, the set of images is divided into a number of subrealizations which are fitted separately. The variation of the resulting profiles describes the statistical error. If the statistical error is below a pregiven level then profile estimation is finished, otherwise more images are accumulated. The statistic which is estimated is the expected top-to-bottom amplitude of random deviations of the estimated profile from the expected one. One may call it the range of random error, or in short, the random error. The estimation is completed if the range of random error is below a predefined limit, for example below 2%. Figure 2 illustrates some steps of background profile estimation.

The statistic which we have called the range of random error is useful since it also characterizes the non-flatness of the image after correction. If a precise profile is used for correction then the corrected image is perfectly flat. If the profile is wrong by, for example, 1% at certain image coordinates then the corrected image deviates from the perfectly flat image by 1% in the opposite direction. Thus, to a good approximation, the expected non-flatness of the image after correction is equal to the range of random error of the estimated profile.
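The accumulation, outlier-rejection and error-estimation loop described above can be sketched as follows. This is a simplified reading of the procedure, not the published implementation: the function name is hypothetical, a NaN-aware mean stands in for the global fit, and the 3-sigma outlier rule and the way the top-to-bottom range of random error is approximated are our assumptions.

import numpy as np

def estimate_background_profile(binned_images, max_error=0.02, n_splits=4):
    """Estimate a normalized background profile from binned background images.

    binned_images -- array (n_images, h, w); NaN outside detected background
    max_error     -- target top-to-bottom range of random error (e.g. 2%)
    """
    images = np.asarray(binned_images, dtype=float)
    keep = np.ones(len(images), dtype=bool)

    # Iteratively discard outlier images until none remain.
    while True:
        fit = np.nanmean(images[keep], axis=0)          # stand-in for the global fit
        dev = np.nanmean((images[keep] - fit) ** 2, axis=(1, 2))
        bad = dev > dev.mean() + 3.0 * dev.std()
        if not bad.any():
            break
        idx = np.flatnonzero(keep)
        keep[idx[bad]] = False

    profile = fit / np.nanmean(fit)                     # normalize to mean 1.0

    # Random error: split the remaining images into subrealizations, fit
    # each separately and look at the spread of the resulting profiles.
    subsets = np.array_split(np.flatnonzero(keep), n_splits)
    sub_profiles = []
    for s in subsets:
        p = np.nanmean(images[s], axis=0)
        sub_profiles.append(p / np.nanmean(p))
    spread = np.nanstd(sub_profiles, axis=0)
    random_error = 2.0 * np.nanmax(spread)              # rough top-to-bottom range

    if random_error > max_error:
        raise RuntimeError("accumulate more images before applying the profile")
    return profile, random_error, keep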

Fig. 1. Flow chart for profile estimation.

Data for foreground estimation are preprocessed and accumulated in parallel with the background data (see the right column of the data flow chart). However, only when estimation of the background profile is finished may the fitting steps of the foreground profile begin. Compared to the background profile, estimation of the foreground profile is more sophisticated due to strong inter- and intraobject intensity variation. After segmentation and binning of each image, there are more stages of processing compared to the background. As was done with the background images, a number of images are processed together in order to detect outlier images and discard them. Next, an average image is calculated from the set. This is repeated with new sets of images several times. After this step, a number of averaged foreground images, each from a different set of original images, have been accumulated. Then, background subtraction is applied to each of these images. After this step, the images are fitted globally to determine the foreground profile. Furthermore, the images are divided into smaller sets which are fitted separately in order to estimate statistical errors, as in the process of background profile estimation. Figure 3 illustrates some steps of foreground profile estimation.

A fundamental assumption of our approach is that the normalized background (and foreground) profiles are practically identical for all images (wells and fields) in the data set, as the optical device is assumed to be stable during the measurement. However, we estimate a separate set of profiles for each detection channel (colour), as the optical settings differ between channels. The necessity of this was confirmed during our studies, see the Results section.

Our approach may, but need not, involve a model for the shape of the background and foreground profiles. A polynomial model has been recommended in the literature; usually the second-order polynomial is considered sufficiently adequate (Van den Doel et al., 1998; Likar et al., 2000; Young, 2000). We have, however, encountered many cases where the second-order polynomial model is not adequate. In particular, with modern wide field instruments, profiles may be so deep that a fit based on a second-order polynomial crosses zero. We have applied either a fourth-order polynomial model, or worked model-free using image regularization. (In principle, other models than the fourth-order polynomial, with a limited number of parameters, can also be applied, but we think that image regularization is an approach which is more universal than any particular model.)
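For the parametric branch, a fourth-order two-dimensional polynomial can be fitted to the binned, NaN-masked profile data by ordinary least squares, for example as in the sketch below. This is our own illustration, not the published code; the function name and the coordinate scaling are arbitrary choices.

import numpy as np

def fit_polynomial_profile(binned, order=4):
    """Fit a 2-D polynomial of the given order to a binned profile image.

    binned -- 2-D array with NaN where no data were accumulated
    Returns the smooth profile on the full grid, normalized to mean 1.0.
    """
    h, w = binned.shape
    y, x = np.mgrid[0:h, 0:w]
    # Centre and scale coordinates to keep the design matrix well conditioned.
    xs = (x - w / 2) / w
    ys = (y - h / 2) / h

    # Design matrix with all monomials x^i * y^j, i + j <= order.
    terms = [xs**i * ys**j for i in range(order + 1)
                           for j in range(order + 1 - i)]
    A = np.stack([t.ravel() for t in terms], axis=1)
    b = binned.ravel()
    ok = np.isfinite(b)

    coef, *_ = np.linalg.lstsq(A[ok], b[ok], rcond=None)
    profile = (A @ coef).reshape(h, w)
    return profile / profile.mean()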

Fig. 2. Illustration of intermediate data in background profile estimation. (A) An original image. (B) Detected background regions of the same image. (C) A downscaled image of background regions. (D) Average over 39 downscaled images of background regions.

Problems due to overfitting are minimized by our built-in quality control, which keeps random errors under a limit.

Background segmentation

In this section the detection of background regions is described in detail. A background region is a region without foreground objects; however, pixels which physically belong to the background (i.e. there are no objects in that place in the sample) but which are close to an intense foreground object exhibit an intensity which is slightly increased compared to the characteristic background level. Including such regions would increase the variance of the background intensity and thus the random error in the subsequent linear fitting. Thus, only regions which are safely background regions are considered during profile estimation. As expressed by Piccinini et al. (2012), detecting all the background pixels is not crucial; what is fundamental is that all the detected pixels, except a negligible number, belong to the background.

Of the methods we have tested for background segmentation we have found noise-based segmentation to be the best approach. This selection is in accordance with Lee et al. (2014), who applied a simple lower threshold to the local intensity variation. Our approach in addition accounts for the dependence of the expected local intensity variation on the mean intensity (the mean background intensity is different in different locations of the image) and eliminates the contribution of the slope of the background profile to the intensity variation, keeping only variation of random nature.

A simple theoretical explanation of noise-based segmentation is as follows. If light detection were a purely digital process then the number of detected photons at constant illumination would have a Poisson distribution. The Poisson distribution of a variable n has the property

var(n) = ⟨n⟩.   (4)

Fig. 3. Illustration of intermediate data in foreground profile estimation. (A) An original image. (B) Detected foreground regions of the same image. (C) Downscaled image of B. (D) Average over 128 images like C, after subtraction of the background contribution.

Fig. 4. Illustration of the noise image. (A) A measured image of cells. (B) Image of local noise; in this illustration the greyscale starts at 0.5 and ends at 2.0. Local noise is defined as local variance divided by local mean. Here, local statistics have been calculated using a disk kernel of radius 10 pixels, but local statistics in blocks can be used as well. The histogram of intensities of the noise image has a peak with maximum at 1.09; this peak corresponds to background regions. In this example, due to contributions from cell regions, the mean of the noise image is as high as 11.8.

Even though the readout of a real multiarray detector is not directly the number of photons, its variance divided by its mean at constant illumination is quite well a constant over a wide range of intensities. This constant can be estimated for each detection channel from a series of images containing some background regions. Mean and variance are measured in small regions (e.g. of 9 x 9 pixel size) where the intensity profile is nearly constant. But the profile may still have a significant slope there. We have therefore found it necessary to subtract the contribution of the slope of the profile to the intensity variance. (This indeed is a chicken-and-egg problem.) We have used two ways to accomplish this. When segmenting the very first images, we have not yet estimated the profile (not even approximately); we assume that the profile is centred and calculate the noise level for a series of different curvatures of the profile. The curvature at which the noise level is minimal is selected. Later, when we already have an estimate of the profile, we use this knowledge to subtract the contribution due to profile variation. Thus, we estimate the variation of detected intensities at each location without a significant contribution due to profile variation.

Having calculated the ratio of local random variance to local mean intensity, the background regions are characterized by a result which is very nearly constant (independent of the image coordinates, unlike the background intensity itself), whereas in foreground regions the outcome is nearly everywhere far bigger than the constant characteristic of the background. (This is so because foreground objects almost nowhere have sufficiently big regions of constant intensity.) The ratio of local variance to local average intensity (called the noise image) for a typical cell image is presented in Figure 4. We apply an upper threshold to the noise image and get a mask. This mask is still not sufficiently good since it may include some high-intensity regions inside cells of occasionally constant intensity. Those regions are removed by an additional step using an intensity threshold. The result is a good quality background mask.
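A compact sketch of such noise-based background detection is given below. It uses a square uniform filter instead of the disk kernel of Figure 4, omits the subtraction of the profile-slope contribution discussed above, and the function name, thresholds and intensity-based clean-up factor are illustrative assumptions rather than the authors' actual parameters.

import numpy as np
from scipy.ndimage import uniform_filter

def background_mask(image, noise_constant, size=9,
                    noise_factor=2.0, intensity_factor=2.0):
    """Detect 'safe' background pixels from the local variance/mean ratio.

    image          -- raw fluorescence image
    noise_constant -- detector constant: local variance / local mean in background
    size           -- side of the square neighbourhood used for local statistics
    """
    img = image.astype(np.float64)
    local_mean = uniform_filter(img, size)
    local_sq = uniform_filter(img * img, size)
    local_var = local_sq - local_mean ** 2

    # Noise image: in background regions this ratio is roughly constant,
    # in foreground regions it is much larger (see Fig. 4).
    noise = local_var / np.maximum(local_mean, 1e-12)
    mask = noise < noise_factor * noise_constant

    # Remove occasional flat, bright regions inside cells with an
    # additional intensity threshold relative to the background level.
    bg_level = np.median(img[mask]) if mask.any() else np.median(img)
    mask &= img < intensity_factor * bg_level
    return mask
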
Foreground segmentation

As the resulting background mask does not contain all nonobject pixels, the foreground objects need to be segmented separately. In this way the foreground mask also does not inherit the segmentation errors from the background segmentation. The most important requirements for foreground segmentation are as follows: (i) the segmentation must be simple, fast and robust; (ii) the segmentation must run without any parameter adjustment or any other kind of user intervention; (iii) the segmentation must not yield artefacts which would distort the estimated profiles. The most difficult requirement is (iii). It is related to the fact that the intensity of foreground objects varies a lot. It is in particular difficult to grant that in different image regions (in the image centre; at the image border; at the image corners) lower and higher intensities of the foreground are represented in identical proportions.

In order to minimize methodological distortions, we have selected the following approach. We divide the original image into a high number of blocks of identical size (one of two block sizes is used, depending on the size of the original image) and apply a simple segmentation algorithm in each of them. In each block we calculate and apply a threshold which is the arithmetic average intensity in the block plus five times the expected standard deviation calculated from the known noise constant. It is important that in each block the segmentation operations are applied without taking intensities in the neighbouring blocks into account. Values of the threshold in different blocks do not necessarily follow the intensity profile, but deviations from the ideal threshold for each block are random. Finally, we bin the foreground image so that each block is represented by a single pixel. All pixels of this final image, whether at the image border or far from the border, are the result of identical treatment. We have tested this segmentation algorithm using synthetic images with a known intensity profile and objects of varying intensity and found that distortions of the estimated profile are negligible.
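The block-wise thresholding and binning just described might look roughly like the following sketch. The block size, the representation of each block by the mean intensity of its foreground pixels and the function name are our assumptions; only the rule "block mean plus five times the expected shot-noise standard deviation" is taken from the text.

import numpy as np

def foreground_blocks(image, noise_constant, block=64):
    """Per-block foreground segmentation and binning.

    Each block is thresholded independently at mean + 5 * expected standard
    deviation (derived from the detector noise constant), and the binned
    output holds the mean foreground intensity of the block, or NaN if the
    block contains no foreground pixels.
    """
    img = image.astype(np.float64)
    h, w = img.shape
    nh, nw = h // block, w // block
    binned = np.full((nh, nw), np.nan)

    for i in range(nh):
        for j in range(nw):
            tile = img[i * block:(i + 1) * block, j * block:(j + 1) * block]
            # Expected shot-noise standard deviation at the block's mean level.
            sigma = np.sqrt(noise_constant * tile.mean())
            fg = tile > tile.mean() + 5.0 * sigma
            if fg.any():
                binned[i, j] = tile[fg].mean()
    return binned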

Fig. 5. Illustration of background profiles for different channels. False colour images of background profiles were estimated from two different sets of wells of the same plate (upper images: wells from the upper half of the plate; lower images: wells from the lower half of the plate) and for two detection channels (A and C: Draq5; B and D: Hoechst 33342). Each colour represents an intensity interval of 5%.

Fig. 6. Comparison of background and foreground profiles. False colour images of background and foreground profiles were estimated for the Hoechst channel from two different sets of wells of the same plate (upper images: wells from plate columns 16-19; lower images: wells from plate columns 20-23; A and C show the background profile; B and D the foreground profile).

Profile estimation by polynomial fitting and regularization

Fitting accumulated experimental data against a parametric model and image regularization are different operations, but they have the same goal: from experimental data which may be rather noisy, a smooth function is determined. In fitting, the minimized function is the sum of squares of deviations between the experimental data and a smooth function calculated from the model. If the model is polynomial then the fitting problem is linear, that is, each measured data point is a linear function of the parameters of the model. A linear fitting problem finds a stable solution in a single iteration.

Regularization is a less frequently used operation than fitting; therefore, we describe it in more detail. Formally, the parameters of the model are the intensities in all pixels of the binned solution image, but the minimized function consists of two terms, a deviation energy and a regularization energy:

F = (1/N) Σ_{i,j} (E_ij - T_ij)² + R² Σ_{i,j} (Lxx_ij + Lyy_ij)²,   (5)

where N is the number of pixels, E_ij is the measured data (a noisy image), T_ij is the solution (a smooth image), R is the regularization coefficient, and Lxx_ij and Lyy_ij are the second partial derivatives of T_ij. The first term is the same as in fitting, but the second term is introduced to penalize nonsmooth solutions. A data set is usually an image of size 17 x 16, since binning by a factor of about 64 is applied to the original images when accumulating experimental data for profile estimation.
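Because Eq. (5) is quadratic in the unknown pixel values T_ij, its minimizer can be obtained by solving a single linear system, which is cheap for images of the order of 17 x 16 pixels. The sketch below is our own dense formulation under that reading of Eq. (5); the handling of missing (NaN) pixels, the small ridge term and the function name are assumptions.

import numpy as np

def regularized_profile(E, reg=0.2):
    """Smooth a small, noisy, NaN-masked profile image by minimizing Eq. (5).

    E   -- binned profile data, e.g. 17 x 16, NaN where no data exist
    reg -- regularization coefficient R
    """
    h, w = E.shape
    n = h * w
    e = E.ravel()
    obs = np.isfinite(e)

    # Discrete second-derivative operators along x and y (dense, since the
    # binned images are tiny).
    def second_diff(m):
        D = np.zeros((m, m))
        for k in range(1, m - 1):
            D[k, k - 1:k + 2] = [1.0, -2.0, 1.0]
        return D
    Dy, Dx = second_diff(h), second_diff(w)
    Iy, Ix = np.eye(h), np.eye(w)
    L = np.kron(Iy, Dx) + np.kron(Dy, Ix)      # Lxx + Lyy on raveled images

    # Normal equations of: sum over observed pixels of (E - T)^2 / N_obs
    # plus R^2 * sum of (L T)^2; a tiny ridge keeps the system nonsingular.
    W = np.diag(obs.astype(float)) / obs.sum()
    A = W + reg**2 * (L.T @ L) + 1e-9 * np.eye(n)
    b = W @ np.nan_to_num(e)
    t = np.linalg.solve(A, b).reshape(h, w)
    return t / t.mean()
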
Evaluation

A relatively simple task in the evaluation of an estimated profile is the estimation of the level of random errors, neglecting methodological errors. This task is accomplished routinely for each data set by dividing the data into a low number of independent realizations, estimating the profile for each of them and determining the random error as a measure of the variation between independent estimates. However, there are also methodological errors, for example errors of the estimated profile due to the use of an inadequate mathematical model, or distortions of the estimated profile due to errors in segmentation.

For comparing different mathematical models (e.g. the second- and the fourth-order polynomial model), we have used a statistic calculated as follows. We divide a big data set into subsets and study the subsets pairwise. One of the subsets is used for profile estimation and the other for verification. The calculated statistic is the mean square deviation between the estimated profile and the verification data set. Naturally, both data sets are normalized. Thus, the statistic measures a mean distance between the fit curve of one data set and the noisy data of the other data set. (In short, we call this statistic the estimation-to-verification distance.) A more appropriate model returns a smaller value of this statistic than a less appropriate model. Overfitting, by definition, pushes the value of the statistic up.

For an overall evaluation of the quality of flat field correction which covers both random and various kinds of methodological errors, the method of overlapping fields is the best approach we have found. However, overlapping fields are not part of routine experiments; thus this tool cannot be used during real-life application of the correction method, but only in the development stage, when comparing different approaches to flat field correction. Ideally, the two images of the same part of the sample cropped from two different image fields are identical. In reality, the pixel intensities of the two images are never identical, not even after a very sophisticated flat field correction. There are random errors of the detected intensities, but usually there are significantly bigger intensity variations of another kind, for example due to tiny variations in focal height, as well as in the physical x- and y-coordinates of each pixel. There are differences between a pair of images of the same part that remain unnoticed when looking at them as separate images but which become apparent after a difference image has been calculated.

We have modified the approach of overlapping fields by the following steps. First, we study the quality of the match between the intensities of the two images separately in background and foreground regions. Second, we normalize the pixelwise difference image by dividing it pixelwise by the average of the two images.
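The modified overlapping-fields statistic can be computed along the following lines. The sketch assumes the overlapping crops have already been extracted and aligned; the function name and array layout are our illustrative choices, while the normalization by the pixelwise average of the two crops and the averaging over many pairs follow the description above.

import numpy as np

def overlap_statistics(crops_a, crops_b, masks):
    """Average normalized difference over many overlapping-field pairs.

    crops_a, crops_b -- arrays (n_pairs, h, w): the same sample part seen in
                        two different image fields (corrected or uncorrected)
    masks            -- boolean array (n_pairs, h, w): background (or
                        foreground) regions to be evaluated
    Returns the averaged normalized difference image and its std in percent.
    """
    a = crops_a.astype(np.float64)
    b = crops_b.astype(np.float64)
    denom = 0.5 * (a + b)
    diff = (a - b) / np.maximum(denom, 1e-12)   # pixelwise normalized difference
    diff = np.where(masks, diff, np.nan)
    mean_diff = np.nanmean(diff, axis=0)        # random fluctuations average out
    flatness = 100.0 * np.nanstd(mean_diff)     # the non-flatness figure quoted
    return mean_diff, flatness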

Table 1. Estimation-to-verification distance of background profiles in 14 test experiments. (Columns: confocal/nonconfocal mode, magnification, channel, and the estimation-to-verification distance for the second-order polynomial, the fourth-order polynomial and regularization at coefficient 0.2. Rows: nonconfocal 10x Hoechst and Alexa; nonconfocal 20x DRAQ5, Hoechst and Alexa; nonconfocal 60x DRAQ5, Hoechst and Alexa; confocal 20x DRAQ5, Hoechst and Alexa; confocal 60x DRAQ5, Hoechst and Alexa.)

The outcome is numerically more informative and less noisy than the absolute difference image. Next, we calculate the pixelwise average over hundreds of normalized difference images; by this, fluctuations of random origin are averaged out. Illustrations of the method of overlapping fields with the above-mentioned additional steps can be found in the Results section.

Results

Reproducibility of estimated profiles

We have estimated profiles from different regions (wells) of the same plate, with the task of verifying whether the estimated profiles are identical (within random error) or whether one can discover a significant shift, for example from one end of the plate to the other. At the same time, we have estimated profiles for different detection channels (i.e. colours). As illustrated in Figure 5, background profiles estimated for different channels are different, but background profiles of a given channel estimated from different regions of the plate are practically identical. As illustrated in Figure 6, background and foreground profiles of the same channel are significantly different, whereas both are well reproduced from different regions of the plate.

Selection of the profile model

We have selected 14 data sets representing different colours (Draq5, excitation 640 nm; Hoechst 33342, excitation 375 nm; Alexa 488, excitation 490 nm) and confocal and nonconfocal acquisition modes. In nearly all cases, both for background and foreground, the fourth-order polynomial exhibited a shorter estimation-to-verification distance than the second-order polynomial. See Table 1 for background data. The fourth-order polynomial and regularization (using 0.2 as the regularization coefficient) had more or less equal performance; only in rare cases was one of them noticeably better than the other.

Fig. 7. Optimization of the value of the regularization coefficient for estimation of the background profile in 14 different cases (axes: regularization coefficient versus estimation-to-verification distance). The estimation-to-verification distance denotes the average, over many pairs of realizations, of the standard deviation of the difference between the estimated profile in one realization and the experimental data in the other realization. The best solution has the lowest value of the estimation-to-verification distance, as explained in the Materials and methods section.

Optimization of the regularization coefficient

The optimal value of the regularization coefficient tends to be bigger for data sets with bigger noise (see Fig. 7 for background data).

Fig. 8. Influence of the number of images on the estimated random error. Estimated random errors of the estimated background (red dots) and foreground (blue diamonds) profiles are plotted versus the number of images used. The shown pair of graphs characterizes a single set of measured images.

Optimization of the value of the regularization coefficient is time consuming and thus preferably avoided in routine operation. As the optimal value varies only slightly between data sets, and in addition the goodness of the solution only weakly depends on its exact value (see Fig. 7), we have fixed the value to 0.2 for the background and 20.0 for the foreground in the rest of this study.

Accuracy of profile estimation

The random error of the estimated profiles strongly depends on the number of images available for profile estimation. Figure 8 shows the random error of the estimates of both background and foreground profiles for a typical data set, versus the number of images used for estimating the profile. The range of variation (defined as maximum minus minimum) of the estimated background profile is 28.8% and that of the foreground profile is 21.7%. (Remember that we use the convention that the mean of each profile is one.) What stands out is the big difference in the numeric values of the random errors between background and foreground profiles. This is a general phenomenon, with exceptions only in extremely confluent cases (cells covering most of the field of view) when there is not sufficient background area for profile estimation. Otherwise, the background profile is typically estimated with better than 2% accuracy already from 10 images, whereas hundreds of images are needed to bring the random error of the foreground profile below 4%. The number of images required for a robust estimate of the foreground profile strongly depends on the variation of inter- and intraobject intensities and on the confluency.

In the following we describe non-flatness by the characteristics of the estimated profiles. Non-flatness of the background (or foreground) image before correction can be characterized by the range of variation of the estimated profile. Usually, this is anywhere between 10% and 50%, depending on the objective, the optical mode (confocal/nonconfocal) and the detection channel. Non-flatness after correction can be approximated by the random error of the estimate of the background (or foreground) profile, when neglecting methodological errors. In typical cases, the non-flatness of the background is reduced by a factor bigger than 10. For performance characteristics of the estimation of background and foreground profiles, see Table 2.

Evaluation of the quality of flat field correction by visual inspection

The result of shading correction for the background is convincing already by visual inspection of images before and after correction. Contrary to the background, it is very difficult to judge the flatness of the foreground profile by visual inspection alone. Only in exceptional cases can one visually notice the effect of foreground correction on a single image. One such case is an exceptionally deep foreground profile. For illustration purposes we have used a special optical arrangement creating an unusually deep intensity profile of excitation and prepared the sample images in colour-coded mode (see Fig. 9).
The range of variation of the estimated background profile is 49.3% (random error 1.7%) and that of the foreground profile is even 109.5% (random error 2.9%). When applying only the background correction (Fig. 9B), the brightest (red) objects still tend to be concentrated around the centre of the image field. After full correction (Fig. 9C), bright cells can be found anywhere in the image field.

Evaluation using the method of overlapping fields

We have run special experiments with pairs of image fields which overlap by 45% in each (x and y) direction. If the second image is shifted right and down then the same part of the sample is seen at the lower right corner of the first image and at the upper left corner of the second image. In Figures 10(A) and (B), the same part of the sample is presented, extracted from two image fields. The fact that they are not identical becomes apparent when plotting a difference image, see Figure 10(C). Here we have plotted data from background regions only; foreground regions have been studied separately. Figure 10(D) differs from Figure 10(C) only in that the images have been corrected against the flat field profiles before calculating the difference image. The effect of correction is clearly visible in the difference image calculated from a single pair of images, but the effect is even more impressive when we average difference images from many pairs of images, thus reducing noise of random origin, see Figures 10(E) and (F). Judging by the standard deviations of the intensities in Figures 10(E) and (F) (11.8% and 1.65%), the non-flatness in background regions has been reduced by a factor of 7.1 in this example.

Fig. 9. Effect of partial (background only) and full flat field correction on image visualization and quantification. (A) Uncorrected image with Hoechst-stained nuclei in false colour image display. (B) Data set corrected against the background profile. (C) Fully corrected data set.

Fig. 10. The method of overlapping fields applied to background regions. (A), (B) The same part of a sample extracted from two image fields which partially overlap. (C) Colour-coded normalized difference image of A and B. (D) Same as C but the pair of images has been corrected for flat field profiles prior to calculation of the difference image. (E), (F) The same as C and D but averaged over many (1920) difference images, each extracted from a pair of similarly overlapping image fields.

Fig. 11. The method of overlapping fields applied to foreground regions. (A), (B) The same part of a sample extracted from two image fields which partially overlap. (C) Colour-coded normalized difference image of A and B. (D) Same as C but the pair of images has been corrected for flat field profiles prior to calculation of the difference image. (E), (F) The same as C and D but averaged over many (1280) difference images, each extracted from a pair of similarly overlapping image fields.

We have done exactly the same for foreground regions, see Figure 11. The normalized difference image in foreground regions shows some artefacts that are most likely due to tiny variations in all physical x-, y- and z-coordinates of each pixel of the two extracted images. However, when averaging over very many difference images, the artefacts disappear, as is apparent in Figures 11(E) and (F). The standard deviations of the images shown in Figures 11(E) and (F) are 4.2% and 0.65%, respectively, that is, the flatness quality in the foreground has improved by a factor of 6.5 in this example. A remarkable detail of this test experiment is that the estimated background profile is more than twice as deep as the foreground profile. If the background profile were applied to the foreground as well, this would result in overcorrection, with a loss rather than a gain in flat field quality.

Performance of the outlier filter

The stability of the results of profile estimation is, to a great extent, achieved by the built-in outlier filter, which detects images with artefacts that would impair the quality of the derived profiles. An example is given in Figure 12.

Table 2. Performance of the estimation of background and foreground profiles in 14 different cases. (Columns: confocal/nonconfocal mode, magnification, channel, image count for the background profile, image count for the foreground profile, range of variation of the background profile, range of variation of the foreground profile, random error of the background profile, random error of the foreground profile, and duration of estimation in seconds. Rows: nonconfocal 10x Hoechst and Alexa; nonconfocal 20x Hoechst, Alexa and DRAQ5; nonconfocal 60x Hoechst, Alexa and DRAQ5; confocal 20x Hoechst, Alexa and DRAQ5; confocal 60x Hoechst, DRAQ5 and Alexa.)

Fig. 12. Illustration of the performance of the outlier filter. There are eight images in the set. The foreground outlier filter has detected that the last two images (G, H) potentially contain artefacts and discarded them from profile estimation. In fact, we prefer to use 16 rather than 8 images as the input of the outlier filter, just to improve the statistical performance of the filtering.

The exclusion of the full image instead of just the artefact object itself might seem quite rigorous, but, for example, very shiny artefacts tend to alter their surroundings as well, which would otherwise remain in the data considered for profile estimation.

Discussion and conclusion

There are multiple approaches to correcting for vignetting. In microscopy, however, those which assume that there is only one intensity profile still prevail. They are either based on the acquisition of a calibration image from an empty field or a reference slide (Model & Burkhardt, 2001; Zwier et al., 2004), or, in more automatic methods, the intensity profile is estimated by combining background regions from a number of regular sample images (Gherardi et al., 2011; Piccinini et al., 2012) or by averaging and smoothing sample images (Leong et al., 2003; Jones et al., 2006).

But why are single-profile methods still widely used, and why does the poor quality of flat field correction remain apparently unnoticed, although segmentation quality and the variation in calculated characteristics of cells may become worse rather than better if an inadequate flat field correction is applied? The main reason is that the effect of background correction is visually the most striking. Furthermore, it generally improves the quality of image analysis, such as segmentation, significantly. Also, for dim samples, background correction alone already leads to a considerable levelling of object intensities. In contrast, the positive effect of foreground correction is hardly noticeable by visual inspection. If, however, absolute intensity thresholds (for example, for object classification) or intensity ratios of different measurement channels are used, the results are strongly impaired if no foreground correction has been applied and the intensity profiles of the related channels do not match.

In this work we present evidence that it is mandatory to estimate background and foreground profiles for each detection channel to achieve a good flat field correction. A single estimated intensity profile is insufficient for this purpose. We have overcome the difficulty of demonstrating the effect and quality of the foreground correction by applying a recently introduced method of overlapping fields (Smith et al., 2015). In our impression, this is a very powerful method, in particular with the modifications described in the Materials and methods section above.

We have developed a fully automatic procedure which determines the background and foreground profiles directly from the sample images instead of from calibration samples. This is beneficial as no time-consuming calibration step is required (being undesirable especially in high-content screening) and the resulting profiles always reflect the current optical adjustment of the imaging instrument used. Most importantly, however, the commonly used calibration samples for flat field correction (usually fluorescent dye solutions) do not adequately reflect the characteristics of the foreground profile.

An automatic procedure for profile estimation demands that the algorithm be very robust; any impairment of the images must be prevented by all means. In particular, a robust segmentation is crucial. We have therefore developed segmentation algorithms tailored for this application. In addition, we established an inherent quality control, one part based on detecting artefacts in single sample images and the other comparing the results from parallel estimations of the intensity profile and removing outlier profiles. The latter has worked most reliably in all data sets we have tested so far. In the worst case, if the random error of the final estimation step is still too high, the correction profile is not applied, so any deterioration of the images can be ruled out. In our experience this only occurs if the majority of images show artefacts like well borders. Our algorithm can thus run completely unsupervised for a wide range of biological samples. Also, no assumption about a uniform distribution of objects in the field of view has been made. Furthermore, no a priori knowledge about the shape of the intensity profile is required. Our method can work model-free, just using image regularization. Nevertheless, a mathematical model of a profile (e.g. the fourth-order polynomial) can also be incorporated, which makes the procedure more efficient. With this level of sophistication of the algorithms the computing time quite naturally increases. Approximately 75% of the calculation time is spent on segmentation.
Nevertheless, in most experiments the calculation time for the estimation of intensity profiles is not noticeable at all. We collect the data for profile estimation in parallel to the measurement, and usually profile estimation is completed before all wells of a plate have been measured.

A limiting aspect of our approach is the requirement of hundreds of images for an adequate estimation of a foreground profile. This is not a problem in most high-throughput screening applications, but it is hardly acceptable in many other branches of research. Methods like CIDRE (Smith et al., 2015) could potentially be used to lower the number of required images and make the method applicable in branches of research which do not inherently work with a high number of images, as high-throughput screening does.

Acknowledgements

We would like to thank Angelika Foitzik and Katharina Rehs for selecting and preparing measurements for algorithm development, Elmar Thews for fine adjustment of the equipment and Sebastian Bösch for running measurements with overlapping fields. We are also grateful for the fruitful discussions we had with the PerkinElmer team, especially Matthias Fassler and Hartwig Preckel.

References

Bevilacqua, A., Piccinini, F. & Gherardi, A. (2011) Vignetting correction by exploiting an optical microscopy image sequence. In Proceedings of the Thirty-Third International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE, Boston, MA, U.S.A.

Gareau, D.S., Patel, Y.G., Li, Y., Aranda, I., Halpern, A.C., Nehal, K.S. & Rajadhyaksha, M. (2009) Confocal mosaicing microscopy in skin excisions: a demonstration of rapid surgical pathology. J. Microsc. 233.

Gherardi, A., Bevilacqua, A. & Piccinini, F. (2011) Illumination field estimation through background detection in optical microscopy. In Proceedings of the 2011 IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB). IEEE, Paris, France.

Goldman, D.B. & Chen, J.H. (2005) Vignette and exposure calibration and compensation. In Proceedings of the Tenth IEEE International Conference on Computer Vision, ICCV 2005, Vol. 1. IEEE, Beijing, China.

Jones, T.R., Carpenter, A.E., Sabatini, D.M. & Golland, P. (2006) Methods for high-content, high-throughput image-based cell screening. In Proceedings of the Workshop on Microscopic Image Analysis with Applications in Biology, Vol. 5. Heidelberg, Germany.

Lee, S.S., Pelet, S., Peter, M. & Dechant, R. (2014) A rapid and effective vignetting correction for quantitative microscopy. RSC Adv. 4.

© 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of the Royal Microscopical Society. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.


More information

Specifications for a successful analysis. Standard Tool. 1 Description of WimTube... 2

Specifications for a successful analysis. Standard Tool. 1 Description of WimTube... 2 Specifications for a successful analysis Standard Tool 1 Description of WimTube... 2 2 Specifications for the input files... 2 2.1 Description... 2 2.2 Valid Formats... 2 2.3 Microscopy Techniques... 2

More information

Application Note #548 AcuityXR Technology Significantly Enhances Lateral Resolution of White-Light Optical Profilers

Application Note #548 AcuityXR Technology Significantly Enhances Lateral Resolution of White-Light Optical Profilers Application Note #548 AcuityXR Technology Significantly Enhances Lateral Resolution of White-Light Optical Profilers ContourGT with AcuityXR TM capability White light interferometry is firmly established

More information

JOHANN CATTY CETIM, 52 Avenue Félix Louat, Senlis Cedex, France. What is the effect of operating conditions on the result of the testing?

JOHANN CATTY CETIM, 52 Avenue Félix Louat, Senlis Cedex, France. What is the effect of operating conditions on the result of the testing? ACOUSTIC EMISSION TESTING - DEFINING A NEW STANDARD OF ACOUSTIC EMISSION TESTING FOR PRESSURE VESSELS Part 2: Performance analysis of different configurations of real case testing and recommendations for

More information

IMAGE PROCESSING TECHNIQUES FOR CROWD DENSITY ESTIMATION USING A REFERENCE IMAGE

IMAGE PROCESSING TECHNIQUES FOR CROWD DENSITY ESTIMATION USING A REFERENCE IMAGE Second Asian Conference on Computer Vision (ACCV9), Singapore, -8 December, Vol. III, pp. 6-1 (invited) IMAGE PROCESSING TECHNIQUES FOR CROWD DENSITY ESTIMATION USING A REFERENCE IMAGE Jia Hong Yin, Sergio

More information

A Study of Slanted-Edge MTF Stability and Repeatability

A Study of Slanted-Edge MTF Stability and Repeatability A Study of Slanted-Edge MTF Stability and Repeatability Jackson K.M. Roland Imatest LLC, 2995 Wilderness Place Suite 103, Boulder, CO, USA ABSTRACT The slanted-edge method of measuring the spatial frequency

More information

Local Oscillator Phase Noise and its effect on Receiver Performance C. John Grebenkemper

Local Oscillator Phase Noise and its effect on Receiver Performance C. John Grebenkemper Watkins-Johnson Company Tech-notes Copyright 1981 Watkins-Johnson Company Vol. 8 No. 6 November/December 1981 Local Oscillator Phase Noise and its effect on Receiver Performance C. John Grebenkemper All

More information

Figure 1 HDR image fusion example

Figure 1 HDR image fusion example TN-0903 Date: 10/06/09 Using image fusion to capture high-dynamic range (hdr) scenes High dynamic range (HDR) refers to the ability to distinguish details in scenes containing both very bright and relatively

More information

A Kalman-Filtering Approach to High Dynamic Range Imaging for Measurement Applications

A Kalman-Filtering Approach to High Dynamic Range Imaging for Measurement Applications A Kalman-Filtering Approach to High Dynamic Range Imaging for Measurement Applications IEEE Transactions on Image Processing, Vol. 21, No. 2, 2012 Eric Dedrick and Daniel Lau, Presented by Ran Shu School

More information

Impulse noise features for automatic selection of noise cleaning filter

Impulse noise features for automatic selection of noise cleaning filter Impulse noise features for automatic selection of noise cleaning filter Odej Kao Department of Computer Science Technical University of Clausthal Julius-Albert-Strasse 37 Clausthal-Zellerfeld, Germany

More information

Colour Profiling Using Multiple Colour Spaces

Colour Profiling Using Multiple Colour Spaces Colour Profiling Using Multiple Colour Spaces Nicola Duffy and Gerard Lacey Computer Vision and Robotics Group, Trinity College, Dublin.Ireland duffynn@cs.tcd.ie Abstract This paper presents an original

More information

The Noise about Noise

The Noise about Noise The Noise about Noise I have found that few topics in astrophotography cause as much confusion as noise and proper exposure. In this column I will attempt to present some of the theory that goes into determining

More information

WHITE PAPER. Methods for Measuring Flat Panel Display Defects and Mura as Correlated to Human Visual Perception

WHITE PAPER. Methods for Measuring Flat Panel Display Defects and Mura as Correlated to Human Visual Perception Methods for Measuring Flat Panel Display Defects and Mura as Correlated to Human Visual Perception Methods for Measuring Flat Panel Display Defects and Mura as Correlated to Human Visual Perception Abstract

More information

SEAMS DUE TO MULTIPLE OUTPUT CCDS

SEAMS DUE TO MULTIPLE OUTPUT CCDS Seam Correction for Sensors with Multiple Outputs Introduction Image sensor manufacturers are continually working to meet their customers demands for ever-higher frame rates in their cameras. To meet this

More information

REAL-TIME X-RAY IMAGE PROCESSING; TECHNIQUES FOR SENSITIVITY

REAL-TIME X-RAY IMAGE PROCESSING; TECHNIQUES FOR SENSITIVITY REAL-TIME X-RAY IMAGE PROCESSING; TECHNIQUES FOR SENSITIVITY IMPROVEMENT USING LOW-COST EQUIPMENT R.M. Wallingford and J.N. Gray Center for Aviation Systems Reliability Iowa State University Ames,IA 50011

More information

Image Enhancement using Histogram Equalization and Spatial Filtering

Image Enhancement using Histogram Equalization and Spatial Filtering Image Enhancement using Histogram Equalization and Spatial Filtering Fari Muhammad Abubakar 1 1 Department of Electronics Engineering Tianjin University of Technology and Education (TUTE) Tianjin, P.R.

More information

SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS

SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS RADT 3463 - COMPUTERIZED IMAGING Section I: Chapter 2 RADT 3463 Computerized Imaging 1 SECTION I - CHAPTER 2 DIGITAL IMAGING PROCESSING CONCEPTS RADT 3463 COMPUTERIZED IMAGING Section I: Chapter 2 RADT

More information

Single Image Haze Removal with Improved Atmospheric Light Estimation

Single Image Haze Removal with Improved Atmospheric Light Estimation Journal of Physics: Conference Series PAPER OPEN ACCESS Single Image Haze Removal with Improved Atmospheric Light Estimation To cite this article: Yincui Xu and Shouyi Yang 218 J. Phys.: Conf. Ser. 198

More information

IncuCyte ZOOM Fluorescent Processing Overview

IncuCyte ZOOM Fluorescent Processing Overview IncuCyte ZOOM Fluorescent Processing Overview The IncuCyte ZOOM offers users the ability to acquire HD phase as well as dual wavelength fluorescent images of living cells producing multiplexed data that

More information

Examination, TEN1, in courses SK2500/SK2501, Physics of Biomedical Microscopy,

Examination, TEN1, in courses SK2500/SK2501, Physics of Biomedical Microscopy, KTH Applied Physics Examination, TEN1, in courses SK2500/SK2501, Physics of Biomedical Microscopy, 2009-06-05, 8-13, FB51 Allowed aids: Compendium Imaging Physics (handed out) Compendium Light Microscopy

More information

Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network

Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network 436 JOURNAL OF COMPUTERS, VOL. 5, NO. 9, SEPTEMBER Image Recognition for PCB Soldering Platform Controlled by Embedded Microchip Based on Hopfield Neural Network Chung-Chi Wu Department of Electrical Engineering,

More information

Fast Inverse Halftoning

Fast Inverse Halftoning Fast Inverse Halftoning Zachi Karni, Daniel Freedman, Doron Shaked HP Laboratories HPL-2-52 Keyword(s): inverse halftoning Abstract: Printers use halftoning to render printed pages. This process is useful

More information

Problem Set I. Problem 1 Quantization. First, let us concentrate on the illustrious Lena: Page 1 of 14. Problem 1A - Quantized Lena Image

Problem Set I. Problem 1 Quantization. First, let us concentrate on the illustrious Lena: Page 1 of 14. Problem 1A - Quantized Lena Image Problem Set I First, let us concentrate on the illustrious Lena: Problem 1 Quantization Problem 1A - Original Lena Image Problem 1A - Quantized Lena Image Problem 1B - Dithered Lena Image Problem 1B -

More information

FLAT FIELD DETERMINATIONS USING AN ISOLATED POINT SOURCE

FLAT FIELD DETERMINATIONS USING AN ISOLATED POINT SOURCE Instrument Science Report ACS 2015-07 FLAT FIELD DETERMINATIONS USING AN ISOLATED POINT SOURCE R. C. Bohlin and Norman Grogin 2015 August ABSTRACT The traditional method of measuring ACS flat fields (FF)

More information

PRACTICAL ASPECTS OF ACOUSTIC EMISSION SOURCE LOCATION BY A WAVELET TRANSFORM

PRACTICAL ASPECTS OF ACOUSTIC EMISSION SOURCE LOCATION BY A WAVELET TRANSFORM PRACTICAL ASPECTS OF ACOUSTIC EMISSION SOURCE LOCATION BY A WAVELET TRANSFORM Abstract M. A. HAMSTAD 1,2, K. S. DOWNS 3 and A. O GALLAGHER 1 1 National Institute of Standards and Technology, Materials

More information

Grid Assembly. User guide. A plugin developed for microscopy non-overlapping images stitching, for the public-domain image analysis package ImageJ

Grid Assembly. User guide. A plugin developed for microscopy non-overlapping images stitching, for the public-domain image analysis package ImageJ BIOIMAGING AND OPTIC PLATFORM Grid Assembly A plugin developed for microscopy non-overlapping images stitching, for the public-domain image analysis package ImageJ User guide March 2008 Introduction In

More information

Low Spatial Frequency Noise Reduction with Applications to Light Field Moment Imaging

Low Spatial Frequency Noise Reduction with Applications to Light Field Moment Imaging Low Spatial Frequency Noise Reduction with Applications to Light Field Moment Imaging Christopher Madsen Stanford University cmadsen@stanford.edu Abstract This project involves the implementation of multiple

More information

QUANTITATIVE IMAGE TREATMENT FOR PDI-TYPE QUALIFICATION OF VT INSPECTIONS

QUANTITATIVE IMAGE TREATMENT FOR PDI-TYPE QUALIFICATION OF VT INSPECTIONS QUANTITATIVE IMAGE TREATMENT FOR PDI-TYPE QUALIFICATION OF VT INSPECTIONS Matthieu TAGLIONE, Yannick CAULIER AREVA NDE-Solutions France, Intercontrôle Televisual inspections (VT) lie within a technological

More information

Digital Camera Technologies for Scientific Bio-Imaging. Part 2: Sampling and Signal

Digital Camera Technologies for Scientific Bio-Imaging. Part 2: Sampling and Signal Digital Camera Technologies for Scientific Bio-Imaging. Part 2: Sampling and Signal Yashvinder Sabharwal, 1 James Joubert 2 and Deepak Sharma 2 1. Solexis Advisors LLC, Austin, TX, USA 2. Photometrics

More information

Lecture - 06 Large Scale Propagation Models Path Loss

Lecture - 06 Large Scale Propagation Models Path Loss Fundamentals of MIMO Wireless Communication Prof. Suvra Sekhar Das Department of Electronics and Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 06 Large Scale Propagation

More information

Temperature Dependent Dark Reference Files: Linear Dark and Amplifier Glow Components

Temperature Dependent Dark Reference Files: Linear Dark and Amplifier Glow Components Instrument Science Report NICMOS 2009-002 Temperature Dependent Dark Reference Files: Linear Dark and Amplifier Glow Components Tomas Dahlen, Elizabeth Barker, Eddie Bergeron, Denise Smith July 01, 2009

More information

RELIABILITY OF GUIDED WAVE ULTRASONIC TESTING. Dr. Mark EVANS and Dr. Thomas VOGT Guided Ultrasonics Ltd. Nottingham, UK

RELIABILITY OF GUIDED WAVE ULTRASONIC TESTING. Dr. Mark EVANS and Dr. Thomas VOGT Guided Ultrasonics Ltd. Nottingham, UK RELIABILITY OF GUIDED WAVE ULTRASONIC TESTING Dr. Mark EVANS and Dr. Thomas VOGT Guided Ultrasonics Ltd. Nottingham, UK The Guided wave testing method (GW) is increasingly being used worldwide to test

More information

BACKGROUND SEGMENTATION IN MICROSCOPY IMAGES

BACKGROUND SEGMENTATION IN MICROSCOPY IMAGES BACKGROUND SEGMENTATION IN MICROSCOPY IMAGES J.J. Charles, L.I. Kuncheva School of Computer Science, University of Wales, Bangor, LL57 1UT, United Kingdom jjc@informatics.bangor.ac.uk B. Wells Conwy Valley

More information

SYSTEMATIC NOISE CHARACTERIZATION OF A CCD CAMERA: APPLICATION TO A MULTISPECTRAL IMAGING SYSTEM

SYSTEMATIC NOISE CHARACTERIZATION OF A CCD CAMERA: APPLICATION TO A MULTISPECTRAL IMAGING SYSTEM SYSTEMATIC NOISE CHARACTERIZATION OF A CCD CAMERA: APPLICATION TO A MULTISPECTRAL IMAGING SYSTEM A. Mansouri, F. S. Marzani, P. Gouton LE2I. UMR CNRS-5158, UFR Sc. & Tech., University of Burgundy, BP 47870,

More information

Simulation comparisons of monitoring strategies in narrow bandpass filters and antireflection coatings

Simulation comparisons of monitoring strategies in narrow bandpass filters and antireflection coatings Simulation comparisons of monitoring strategies in narrow bandpass filters and antireflection coatings Ronald R. Willey Willey Optical, 13039 Cedar St., Charlevoix, Michigan 49720, USA (ron@willeyoptical.com)

More information

Laboratory 1: Uncertainty Analysis

Laboratory 1: Uncertainty Analysis University of Alabama Department of Physics and Astronomy PH101 / LeClair May 26, 2014 Laboratory 1: Uncertainty Analysis Hypothesis: A statistical analysis including both mean and standard deviation can

More information

Pipeline for illumination correction of images for high-throughput microscopy

Pipeline for illumination correction of images for high-throughput microscopy Journal of Microscopy, Vol. 00, Issue 0 2014, pp. 1 6 Received 10 April 2014; accepted 12 August 2014 doi: 10.1111/jmi.12178 Pipeline for illumination correction of images for high-throughput microscopy

More information

Be aware that there is no universal notation for the various quantities.

Be aware that there is no universal notation for the various quantities. Fourier Optics v2.4 Ray tracing is limited in its ability to describe optics because it ignores the wave properties of light. Diffraction is needed to explain image spatial resolution and contrast and

More information

Aesthetically Pleasing Azulejo Patterns

Aesthetically Pleasing Azulejo Patterns Bridges 2009: Mathematics, Music, Art, Architecture, Culture Aesthetically Pleasing Azulejo Patterns Russell Jay Hendel Mathematics Department, Room 312 Towson University 7800 York Road Towson, MD, 21252,

More information

A Short History of Using Cameras for Weld Monitoring

A Short History of Using Cameras for Weld Monitoring A Short History of Using Cameras for Weld Monitoring 2 Background Ever since the development of automated welding, operators have needed to be able to monitor the process to ensure that all parameters

More information

Camera Test Protocol. Introduction TABLE OF CONTENTS. Camera Test Protocol Technical Note Technical Note

Camera Test Protocol. Introduction TABLE OF CONTENTS. Camera Test Protocol Technical Note Technical Note Technical Note CMOS, EMCCD AND CCD CAMERAS FOR LIFE SCIENCES Camera Test Protocol Introduction The detector is one of the most important components of any microscope system. Accurate detector readings

More information

Compensation of Analog-to-Digital Converter Nonlinearities using Dither

Compensation of Analog-to-Digital Converter Nonlinearities using Dither Ŕ periodica polytechnica Electrical Engineering and Computer Science 57/ (201) 77 81 doi: 10.11/PPee.2145 http:// periodicapolytechnica.org/ ee Creative Commons Attribution Compensation of Analog-to-Digital

More information

System Identification and CDMA Communication

System Identification and CDMA Communication System Identification and CDMA Communication A (partial) sample report by Nathan A. Goodman Abstract This (sample) report describes theory and simulations associated with a class project on system identification

More information

Nature Methods: doi: /nmeth Supplementary Figure 1. Resolution of lysozyme microcrystals collected by continuous rotation.

Nature Methods: doi: /nmeth Supplementary Figure 1. Resolution of lysozyme microcrystals collected by continuous rotation. Supplementary Figure 1 Resolution of lysozyme microcrystals collected by continuous rotation. Lysozyme microcrystals were visualized by cryo-em prior to data collection and a representative crystal is

More information

An Efficient Noise Removing Technique Using Mdbut Filter in Images

An Efficient Noise Removing Technique Using Mdbut Filter in Images IOSR Journal of Electronics and Communication Engineering (IOSR-JECE) e-issn: 2278-2834,p- ISSN: 2278-8735.Volume 10, Issue 3, Ver. II (May - Jun.2015), PP 49-56 www.iosrjournals.org An Efficient Noise

More information

Novel Histogram Processing for Colour Image Enhancement

Novel Histogram Processing for Colour Image Enhancement Novel Histogram Processing for Colour Image Enhancement Jiang Duan and Guoping Qiu School of Computer Science, The University of Nottingham, United Kingdom Abstract: Histogram equalization is a well-known

More information

Image Processing for feature extraction

Image Processing for feature extraction Image Processing for feature extraction 1 Outline Rationale for image pre-processing Gray-scale transformations Geometric transformations Local preprocessing Reading: Sonka et al 5.1, 5.2, 5.3 2 Image

More information

Very short introduction to light microscopy and digital imaging

Very short introduction to light microscopy and digital imaging Very short introduction to light microscopy and digital imaging Hernan G. Garcia August 1, 2005 1 Light Microscopy Basics In this section we will briefly describe the basic principles of operation and

More information

Objective Evaluation of Edge Blur and Ringing Artefacts: Application to JPEG and JPEG 2000 Image Codecs

Objective Evaluation of Edge Blur and Ringing Artefacts: Application to JPEG and JPEG 2000 Image Codecs Objective Evaluation of Edge Blur and Artefacts: Application to JPEG and JPEG 2 Image Codecs G. A. D. Punchihewa, D. G. Bailey, and R. M. Hodgson Institute of Information Sciences and Technology, Massey

More information

Everything you always wanted to know about flat-fielding but were afraid to ask*

Everything you always wanted to know about flat-fielding but were afraid to ask* Everything you always wanted to know about flat-fielding but were afraid to ask* Richard Crisp 24 January 212 rdcrisp@earthlink.net www.narrowbandimaging.com * With apologies to Woody Allen Purpose Part

More information

3084 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 60, NO. 4, AUGUST 2013

3084 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 60, NO. 4, AUGUST 2013 3084 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 60, NO. 4, AUGUST 2013 Dummy Gate-Assisted n-mosfet Layout for a Radiation-Tolerant Integrated Circuit Min Su Lee and Hee Chul Lee Abstract A dummy gate-assisted

More information

Image Enhancement in Spatial Domain

Image Enhancement in Spatial Domain Image Enhancement in Spatial Domain 2 Image enhancement is a process, rather a preprocessing step, through which an original image is made suitable for a specific application. The application scenarios

More information

Effective and Efficient Fingerprint Image Postprocessing

Effective and Efficient Fingerprint Image Postprocessing Effective and Efficient Fingerprint Image Postprocessing Haiping Lu, Xudong Jiang and Wei-Yun Yau Laboratories for Information Technology 21 Heng Mui Keng Terrace, Singapore 119613 Email: hplu@lit.org.sg

More information

Image Extraction using Image Mining Technique

Image Extraction using Image Mining Technique IOSR Journal of Engineering (IOSRJEN) e-issn: 2250-3021, p-issn: 2278-8719 Vol. 3, Issue 9 (September. 2013), V2 PP 36-42 Image Extraction using Image Mining Technique Prof. Samir Kumar Bandyopadhyay,

More information

Colour analysis of inhomogeneous stains on textile using flatbed scanning and image analysis

Colour analysis of inhomogeneous stains on textile using flatbed scanning and image analysis Colour analysis of inhomogeneous stains on textile using flatbed scanning and image analysis Gerard van Dalen; Aat Don, Jegor Veldt, Erik Krijnen and Michiel Gribnau, Unilever Research & Development; P.O.

More information

Copyright 1997 by the Society of Photo-Optical Instrumentation Engineers.

Copyright 1997 by the Society of Photo-Optical Instrumentation Engineers. Copyright 1997 by the Society of Photo-Optical Instrumentation Engineers. This paper was published in the proceedings of Microlithographic Techniques in IC Fabrication, SPIE Vol. 3183, pp. 14-27. It is

More information

(Quantitative Imaging for) Colocalisation Analysis

(Quantitative Imaging for) Colocalisation Analysis (Quantitative Imaging for) Colocalisation Analysis or Why Colour Merge / Overlay Images are EVIL! Special course for DIGS-BB PhD program What is an Image anyway..? An image is a representation of reality

More information

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors In: M.H. Hamza (ed.), Proceedings of the 21st IASTED Conference on Applied Informatics, pp. 1278-128. Held February, 1-1, 2, Insbruck, Austria Evolving High-Dimensional, Adaptive Camera-Based Speed Sensors

More information

Before you start, make sure that you have a properly calibrated system to obtain high-quality images.

Before you start, make sure that you have a properly calibrated system to obtain high-quality images. CONTENT Step 1: Optimizing your Workspace for Acquisition... 1 Step 2: Tracing the Region of Interest... 2 Step 3: Camera (& Multichannel) Settings... 3 Step 4: Acquiring a Background Image (Brightfield)...

More information

Radionuclide Imaging MII Single Photon Emission Computed Tomography (SPECT)

Radionuclide Imaging MII Single Photon Emission Computed Tomography (SPECT) Radionuclide Imaging MII 3073 Single Photon Emission Computed Tomography (SPECT) Single Photon Emission Computed Tomography (SPECT) The successful application of computer algorithms to x-ray imaging in

More information

How is the Digital Image Generated? Image Acquisition Devices

How is the Digital Image Generated? Image Acquisition Devices In order for image analysis to be performed on a 2D gel, it must first be converted into digital data. Good image capture is critical to guarantee optimal performance of automated image analysis packages

More information

(i) Understanding of the characteristics of linear-phase finite impulse response (FIR) filters

(i) Understanding of the characteristics of linear-phase finite impulse response (FIR) filters FIR Filter Design Chapter Intended Learning Outcomes: (i) Understanding of the characteristics of linear-phase finite impulse response (FIR) filters (ii) Ability to design linear-phase FIR filters according

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

Image Processing Based Vehicle Detection And Tracking System

Image Processing Based Vehicle Detection And Tracking System Image Processing Based Vehicle Detection And Tracking System Poonam A. Kandalkar 1, Gajanan P. Dhok 2 ME, Scholar, Electronics and Telecommunication Engineering, Sipna College of Engineering and Technology,

More information

Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality

Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality Resampling in hyperspectral cameras as an alternative to correcting keystone in hardware, with focus on benefits for optical design and data quality Andrei Fridman Gudrun Høye Trond Løke Optical Engineering

More information

FIBER OPTICS. Prof. R.K. Shevgaonkar. Department of Electrical Engineering. Indian Institute of Technology, Bombay. Lecture: 24. Optical Receivers-

FIBER OPTICS. Prof. R.K. Shevgaonkar. Department of Electrical Engineering. Indian Institute of Technology, Bombay. Lecture: 24. Optical Receivers- FIBER OPTICS Prof. R.K. Shevgaonkar Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture: 24 Optical Receivers- Receiver Sensitivity Degradation Fiber Optics, Prof. R.K.

More information

Centre for Computational and Numerical Studies, Institute of Advanced Study in Science and Technology 2. Dept. of Statistics, Gauhati University

Centre for Computational and Numerical Studies, Institute of Advanced Study in Science and Technology 2. Dept. of Statistics, Gauhati University Cervix Cancer Diagnosis from Pap Smear Images Using Structure Based Segmentation and Shape Analysis 1 Lipi B. Mahanta, 2 Dilip Ch. Nath, 1 Chandan Kr. Nath 1 Centre for Computational and Numerical Studies,

More information

Nadir Margins in TerraSAR-X Timing Commanding

Nadir Margins in TerraSAR-X Timing Commanding CEOS SAR Calibration and Validation Workshop 2008 1 Nadir Margins in TerraSAR-X Timing Commanding S. Wollstadt and J. Mittermayer, Member, IEEE Abstract This paper presents an analysis and discussion of

More information

Imaging Particle Analysis: The Importance of Image Quality

Imaging Particle Analysis: The Importance of Image Quality Imaging Particle Analysis: The Importance of Image Quality Lew Brown Technical Director Fluid Imaging Technologies, Inc. Abstract: Imaging particle analysis systems can derive much more information about

More information

Automatic Locating the Centromere on Human Chromosome Pictures

Automatic Locating the Centromere on Human Chromosome Pictures Automatic Locating the Centromere on Human Chromosome Pictures M. Moradi Electrical and Computer Engineering Department, Faculty of Engineering, University of Tehran, Tehran, Iran moradi@iranbme.net S.

More information

Chapter 12 Image Processing

Chapter 12 Image Processing Chapter 12 Image Processing The distance sensor on your self-driving car detects an object 100 m in front of your car. Are you following the car in front of you at a safe distance or has a pedestrian jumped

More information

IncuCyte ZOOM Scratch Wound Processing Overview

IncuCyte ZOOM Scratch Wound Processing Overview IncuCyte ZOOM Scratch Wound Processing Overview The IncuCyte ZOOM Scratch Wound assay utilizes the WoundMaker-IncuCyte ZOOM-ImageLock Plate system to analyze both 2D-migration and 3D-invasion in label-free,

More information

Making a Panoramic Digital Image of the Entire Northern Sky

Making a Panoramic Digital Image of the Entire Northern Sky Making a Panoramic Digital Image of the Entire Northern Sky Anne M. Rajala anne2006@caltech.edu, x1221, MSC #775 Mentors: Ashish Mahabal and S.G. Djorgovski October 3, 2003 Abstract The Digitized Palomar

More information

Electronic Noise Effects on Fundamental Lamb-Mode Acoustic Emission Signal Arrival Times Determined Using Wavelet Transform Results

Electronic Noise Effects on Fundamental Lamb-Mode Acoustic Emission Signal Arrival Times Determined Using Wavelet Transform Results DGZfP-Proceedings BB 9-CD Lecture 62 EWGAE 24 Electronic Noise Effects on Fundamental Lamb-Mode Acoustic Emission Signal Arrival Times Determined Using Wavelet Transform Results Marvin A. Hamstad University

More information

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION Revised November 15, 2017 INTRODUCTION The simplest and most commonly described examples of diffraction and interference from two-dimensional apertures

More information

4.5 Fractional Delay Operations with Allpass Filters

4.5 Fractional Delay Operations with Allpass Filters 158 Discrete-Time Modeling of Acoustic Tubes Using Fractional Delay Filters 4.5 Fractional Delay Operations with Allpass Filters The previous sections of this chapter have concentrated on the FIR implementation

More information