GEOI 313: Digital Image Processing - I

Module I: Image Representation

Image Display

For remote sensing computing, the image display is especially important because the analyst must be able to examine images and inspect the results of analysis, which are often themselves images. At the simplest level, an image display can be thought of as a high-quality television screen, although displays tailored specifically for image processing have image display processors: special computers designed to receive digital data rapidly from the main computer and display them as brightness on the screen.

The capabilities of an image display are determined by several factors. First is the size of the image it can display, usually specified by the number of rows and columns it can show at any one time. A large display might show 1,024 rows and 1,280 columns. A smaller display with respectable capabilities could show a 1,024 x 768 image; others of much more limited capability could show only smaller sizes, perhaps 640 x 480 (typical of displays for the IBM PC). Second, a display has a given radiometric resolution; that is, for each pixel it has the capability to show a range of brightnesses. One-bit resolution would give the capability to show either black or white, certainly not enough detail to be useful for most purposes. In practice, six bits (64 brightness levels) are probably necessary for images to appear natural, and high-quality displays typically show eight bits (256 brightness levels) or more. A third factor controls the rendition of color in the displayed images. The method of depicting color is closely related to the design of the image display and the display processor. Image display data are held in the frame buffer, a large segment of computer memory dedicated to handling data for display.
The frame buffer provides one or more bits to record the brightness of each pixel to be shown on the screen (the bit plane); thus the displayed image is generated, bit by bit, in the frame buffer. The more bits the frame buffer provides for each pixel, the greater the range of brightnesses that can be shown for that pixel, as explained above. For actual display on the screen, the digital value for each pixel is converted into an electrical signal that controls the brightness of the pixel on the screen. This requires a digital-to-analog (D-to-A) converter that translates discrete digital values into continuous electrical signals (the opposite function of the A-to-D converter mentioned previously).

Different Types of Images and Acquisition

Multispectral Scanning

There are two main modes or methods of scanning employed to acquire multispectral image data: across-track scanning and along-track scanning. Across-track scanners scan the Earth in a series of lines. The lines are oriented perpendicular to the direction of motion of the sensor platform (i.e. across the swath). Each line is scanned from one side of the sensor to the other, using a rotating mirror (A). As the platform moves forward over the Earth, successive scans build up a two-dimensional image of the Earth's surface. The incoming reflected or emitted radiation is separated into several spectral components that are detected independently. The UV, visible, near-infrared, and thermal radiation are dispersed into their constituent wavelengths. A bank of internal detectors (B), each sensitive to a specific range of wavelengths, detects and

measures the energy for each spectral band; the measurements are then converted, as electrical signals, to digital data and recorded for subsequent computer processing. The IFOV (C) of the sensor and the altitude of the platform determine the ground resolution cell viewed (D), and thus the spatial resolution. The angular field of view (E) is the sweep of the mirror, measured in degrees, used to record a scan line, and determines the width of the imaged swath (F). Airborne scanners typically sweep large angles (between 90º and 120º), while satellites, because of their higher altitude, need only sweep fairly small angles (10-20º) to cover a broad region. Because the distance from the sensor to the target increases towards the edges of the swath, the ground resolution cells also become larger, which introduces geometric distortions into the images. Also, the length of time the IFOV "sees" a ground resolution cell as the rotating mirror scans (called the dwell time) is generally quite short, and it influences the design of the spatial, spectral, and radiometric resolution of the sensor.

Along-track scanners also use the forward motion of the platform to record successive scan lines and build up a two-dimensional image, perpendicular to the flight direction. However, instead of a scanning mirror, they use a linear array of detectors (A) located at the focal plane of the image (B) formed by the lens system (C), which is "pushed" along in the flight track direction (i.e. along track). These systems are also referred to as pushbroom scanners, as the motion of the detector array is analogous to the bristles of a broom being pushed along a floor. Each individual detector measures the energy for a single ground resolution cell (D), and thus the size and IFOV of the detectors determine the spatial resolution of the system. A separate linear array is required to measure each spectral band or channel.
For each scan line, the energy detected by each detector of each linear array is sampled electronically and digitally recorded. Along-track scanners with linear arrays have several advantages over across-track mirror scanners. The array of detectors combined with the pushbroom motion allows each detector to "see" and measure the energy from each ground resolution cell for a longer period of time (dwell time). This allows more energy to be detected and improves the radiometric resolution. The increased dwell time also facilitates smaller IFOVs and narrower bandwidths for each detector. Thus, finer spatial and spectral resolution can be achieved without sacrificing radiometric resolution. Because the detectors are usually solid-state microelectronic devices, they are generally smaller and lighter, require less power, and are more reliable and longer-lived because they have no moving parts. On the other hand, cross-calibrating thousands of detectors to achieve uniform sensitivity across the array is necessary and complicated.
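The relationship above between IFOV, platform altitude, and ground resolution cell, including the growth of the cell toward the swath edges, can be sketched numerically. The sensor parameters below are hypothetical, chosen only for illustration:

```python
import math

def ground_cell_size(ifov_rad, altitude_m):
    """Ground resolution cell (metres) at nadir: IFOV (radians) x altitude."""
    return ifov_rad * altitude_m

def cell_size_at_scan_angle(nadir_cell_m, scan_angle_rad):
    """Across-track growth of the cell toward the swath edge."""
    return nadir_cell_m / math.cos(scan_angle_rad) ** 2

# Hypothetical sensor: 0.1 mrad IFOV at 800 km altitude -> 80 m cell at nadir,
# growing slightly at a 7.5 degree scan angle.
print(round(ground_cell_size(0.1e-3, 800_000), 1))
print(round(cell_size_at_scan_angle(80.0, math.radians(7.5)), 1))
```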

Module II: Image Enhancement-I

Contrast Manipulation

Contrast Stretching

Raw images are frequently lacking in contrast or appear over-bright, so the range of displayed brightnesses must be manipulated. There are several types of contrast enhancement, which can be subdivided into linear and non-linear procedures.

Linear Contrast Stretch: This technique translates the image pixel values from the observed range, DNmin to DNmax, to the full range of the display device (generally 0-255, the range representable on an 8-bit display device). The technique can be applied to a single-band, grey-scale image, where the image data are mapped to the display via all three color LUTs. It is not necessary to stretch between DNmax and DNmin: the inflection points for a linear contrast stretch may instead be taken from the 5th and 95th percentiles, or ± 2 standard deviations from the mean (for instance) of the histogram, or chosen to cover the class of land cover of interest (e.g. water at the expense of land, or vice versa). It is also straightforward to have more than two inflection points in a linear stretch, yielding a piecewise linear stretch.

Equalized Contrast Stretch (Histogram Equalisation): The underlying principle of histogram equalisation is straightforward: it is assumed that each level in the displayed image should contain an approximately equal number of pixel values, so that the histogram of the displayed values is almost uniform (though not all 256 classes are necessarily occupied). The objective of histogram equalisation is to spread the range of pixel values present in the input image over the full range of the display device.
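A minimal NumPy sketch of the two stretches described above; the 5th/95th percentile inflection points and the synthetic low-contrast band are illustrative assumptions:

```python
import numpy as np

def linear_stretch(band, low_pct=5, high_pct=95):
    """Piecewise-linear stretch: map the 5th-95th percentile DN range to 0-255,
    clipping values outside the inflection points."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    out = (band.astype(float) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

def histogram_equalize(band):
    """Equalized stretch: map DNs through the normalised cumulative histogram
    so that displayed levels are approximately equally populated."""
    hist, _ = np.histogram(band, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255.0
    return cdf[band].astype(np.uint8)

rng = np.random.default_rng(0)
img = rng.integers(60, 120, size=(64, 64))  # a low-contrast synthetic band
print(linear_stretch(img).min(), linear_stretch(img).max())  # 0 255
```

After the stretch the output occupies the full 0-255 display range, whereas the input was confined to DNs 60-119.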

Gaussian Stretch

This method of contrast enhancement is based upon the histogram of the pixel values. It is called a Gaussian stretch because it involves fitting the observed histogram to a normal, or Gaussian, histogram.

Gray-Level Thresholding

Gray-level thresholding is used to segment an input image into two classes: one for those pixels having values below an analyst-defined gray level, and one for those above this value. Thresholding is commonly used to prepare a binary mask for an image. Such masks segment an image into two classes so that additional processing can then be applied to each class independently.

Level Slicing

Level (density) slicing is the mapping of a range of contiguous grey levels of a single-band image to a point in the RGB color cube. The DNs of a given band are "sliced" into distinct classes. For example, for band 4 of an 8-bit TM image, we might divide the continuous range into discrete intervals of 0-63, 64-127, 128-191 and 192-255. These four classes are displayed as four different grey levels. This kind of level slicing is often used in displaying temperature maps.

Spatial Feature Manipulation

Spatial Filtering

Spatial filtering can be described as selectively emphasizing or suppressing information at different spatial scales over an image. Filtering techniques can be implemented through the Fourier transform in the frequency domain, or in the spatial domain by convolution.

i) Convolution Filters

Two families of filtering method exist: one is based upon the transformation of the image into its scale or spatial frequency components using the Fourier transform; the other operates by convolution in the spatial domain. The spatial domain filters

or the convolution filters are generally classed as either high-pass (sharpening) or low-pass (smoothing) filters.

ii) Low-Pass (Smoothing) Filters

Low-pass filters reveal the underlying two-dimensional waveform with a long wavelength (low-frequency image contrast) at the expense of the higher spatial frequencies. Low-frequency information allows the identification of the background pattern, and produces an output image in which the detail has been smoothed or removed from the original. A two-dimensional moving-average filter is defined in terms of its dimensions, which must be odd, positive and integral but not necessarily equal, and its coefficients; the convolution kernel is, in effect, a description of the PSF weights. The output DN is found from the sum of the products of corresponding convolution kernel and image elements, often divided by the number of kernel elements. A similar smoothing effect is given by a median filter: choosing the median value from the moving window does a better job of suppressing noise and preserving edges than the mean filter. Adaptive filters have kernel coefficients calculated for each window position, based on the mean and variance of the original DNs in the underlying image.

iii) High-Pass (Sharpening) Filters

High spatial frequencies can be enhanced simply by subtracting the low-frequency image resulting from a low-pass filter from the original image. High-frequency information allows us either to isolate or to amplify the local detail. If the high-frequency detail is amplified by adding back to the image some multiple of the high-frequency component extracted by the filter, then the result is a sharper, de-blurred image. High-pass convolution filters can be designed by representing a PSF with a positive centre weight and negative surrounding weights. A typical 3x3 Laplacian filter has a kernel with a high central value, 0 at each corner, and -1 at the centre of each edge. Such filters can be biased in certain directions for the enhancement of edges.
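A NumPy sketch of the two kernel families above; the naive loop, the 4-neighbour Laplacian (central 4, 0 corners, -1 edge centres, matching the description above), and the synthetic test edge are all illustrative:

```python
import numpy as np

def convolve2d(img, kernel):
    """Naive spatial-domain filtering with a symmetric kernel.
    Border pixels are left unfiltered for brevity."""
    k = kernel.shape[0] // 2
    out = img.astype(float).copy()
    for r in range(k, img.shape[0] - k):
        for c in range(k, img.shape[1] - k):
            window = img[r - k:r + k + 1, c - k:c + k + 1]
            out[r, c] = np.sum(window * kernel)  # sum of products
    return out

# Low-pass: 3x3 moving average (coefficients sum to 1).
mean_kernel = np.full((3, 3), 1.0 / 9.0)

# High-pass: 3x3 Laplacian - high central value, 0 corners, -1 edge centres.
laplacian = np.array([[0, -1, 0],
                      [-1, 4, -1],
                      [0, -1, 0]], dtype=float)

edge = np.zeros((5, 5))
edge[:, 2:] = 100.0                    # a vertical brightness edge
print(convolve2d(edge, laplacian)[2])  # the edge is emphasised; flat areas go to 0
```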
High-pass filtering can also be performed using the mathematical concept of derivatives, i.e. gradients in DN throughout the image. Since images are not continuous functions, calculus is dispensed with, and derivatives are instead estimated from the differences in the DN of adjacent pixels in the x, y or diagonal directions. Directional first differencing aims at emphasising edges in the image.
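Directional first differencing reduces to subtracting horizontally (or vertically, or diagonally) adjacent DNs; the one-row sample band is illustrative:

```python
import numpy as np

def horizontal_first_difference(band):
    """Estimate the x-gradient as the difference between adjacent DNs,
    which emphasises vertical edges."""
    band = band.astype(float)
    diff = np.zeros_like(band)
    diff[:, 1:] = band[:, 1:] - band[:, :-1]
    return diff

band = np.array([[50, 50, 200, 200]], dtype=np.uint8)
print(horizontal_first_difference(band))  # [[0. 0. 150. 0.]]
```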

Edge Enhancement

Most interpreters are concerned with recognizing linear features in images. Geologists map faults, joints, and lineaments. Geographers map man-made linear features such as highways and canals. Some linear features occur as narrow lines against a background of contrasting brightness; others are the linear contact between adjacent areas of different brightness. In all cases, linear features are formed by edges. Some edges are marked by pronounced differences in brightness and are readily recognized. More typically, however, edges are marked by subtle brightness differences that may be difficult to recognize. Contrast enhancement may emphasize brightness differences associated with some linear features. This procedure, however, is not specific to linear features, because all elements of the scene are enhanced equally, not just the linear elements.

Fourier Analysis

An image is separated into its various spatial frequency components through application of a mathematical operation known as the Fourier transform. This operation amounts to fitting a continuous function through the discrete DN values as if they were plotted along each row and column of the image. The "peaks and valleys" along any given row or column can be described mathematically by a combination of sine and cosine waves with various amplitudes, frequencies, and phases. A Fourier transform results from the calculation of the amplitude and phase for each possible spatial frequency in the image. After an image is separated into its component spatial frequencies, it is possible to display these values in a two-dimensional scatter plot known as a Fourier spectrum. Fourier analysis is useful in a host of image processing operations in addition to the spatial filtering and image restoration applications.
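A sketch of the Fourier spectrum computation using NumPy's FFT; the synthetic sine-wave band is an assumption chosen so that the spectral peak location is predictable:

```python
import numpy as np

def fourier_spectrum(band):
    """Log-scaled magnitude of the 2-D Fourier transform, zero frequency centred."""
    f = np.fft.fftshift(np.fft.fft2(band.astype(float)))
    return np.log1p(np.abs(f))  # log scaling makes the spectrum displayable

# A pure horizontal sine wave (4 cycles across 64 pixels) concentrates its
# energy at a single spatial frequency on either side of the centre.
x = np.arange(64)
band = np.tile(np.sin(2 * np.pi * 4 * x / 64), (64, 1))
spec = fourier_spectrum(band)
peak = np.unravel_index(np.argmax(spec), spec.shape)
print(peak)  # 4 frequency bins away from the centre (32, 32)
```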

Module III: Image Enhancement-II

Multi-Image Manipulation

Image Arithmetic Operations

The operations of addition, subtraction, multiplication and division are performed on two or more co-registered images of the same geographical area. These techniques are applied to images from separate spectral bands of a single multispectral data set, or to individual bands from image data sets that have been collected at different dates. More complicated algebra is sometimes encountered in the derivation of sea-surface temperature from multispectral thermal infrared data (the so-called split-window and multichannel techniques). Addition of images is generally carried out so that the dynamic range of the output equals that of the input images. Band subtraction is sometimes carried out on co-registered scenes of the same area acquired at different times, for change detection. Multiplication of images normally involves the use of a single "real" image and a binary image made up of ones and zeros. Band ratioing, or division of images, is probably the most common arithmetic operation and the one most widely applied to images in geological, ecological and agricultural applications of remote sensing. Ratio images are enhancements resulting from the division of the DN values of one spectral band by the corresponding DNs of another band. One motivation for this is to iron out differences in scene illumination due to cloud or topographic shadow. Ratio images also bring out spectral variation between different target materials. Multiple ratio images can be used to drive the red, green and blue monitor guns for color images. Interpretation of ratio images must consider that they are "intensity blind": dissimilar materials with different absolute reflectances but similar relative reflectances in the two or more utilised bands will look the same in the output image.

Principal Components and Canonical Components

PCA is appropriate when little prior information about the scene is available.
Canonical component analysis, also referred to as multiple discriminant analysis, may be appropriate when information about particular features of interest is available. Canonical component axes are located to maximize the separability of different user-defined feature types.
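The band ratioing operation described above can be sketched with NumPy; the small offset eps is an assumption added to avoid division by zero, and the DNs are hypothetical:

```python
import numpy as np

def band_ratio(band_a, band_b, eps=1e-6):
    """Ratio image: DNs of one band divided by corresponding DNs of another.
    eps guards against division by zero (an illustrative choice)."""
    return band_a.astype(float) / (band_b.astype(float) + eps)

# Sunlit vs. shadowed pixels of the same material: absolute DNs differ,
# but the ratio is nearly identical - the ratio image is "intensity blind".
nir = np.array([120.0, 60.0])  # sunlit, shadowed
red = np.array([40.0, 20.0])
print(np.round(band_ratio(nir, red), 2))
```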

Vegetation Components

Live green plants absorb solar radiation in the photosynthetically active radiation (PAR) spectral region, which they use as a source of energy in the process of photosynthesis. Leaf cells have also evolved to scatter (i.e., reflect and transmit) solar radiation in the near-infrared spectral region (which carries approximately half of the total incoming solar energy), because the energy per photon in that domain (wavelengths longer than about 700 nanometers) is not sufficient to synthesize organic molecules. A strong absorption at these wavelengths would only result in overheating the plant and possibly damaging the tissues. Hence, live green plants appear relatively dark in the PAR and relatively bright in the near-infrared [3]. By contrast, clouds and snow tend to be rather bright in the red (as well as other visible wavelengths) and quite dark in the near-infrared. The pigment in plant leaves, chlorophyll, strongly absorbs visible light (from 0.4 to 0.7 µm) for use in photosynthesis. The cell structure of the leaves, on the other hand, strongly reflects near-infrared light (from 0.7 to 1.1 µm). The more leaves a plant has, the more these wavelengths of light are affected. Since early instruments of Earth observation, such as NASA's ERTS and NOAA's AVHRR, acquired data in the visible and near-infrared, it was natural to exploit the strong differences in plant reflectance to determine their spatial distribution in these satellite images. The NDVI is calculated from these individual measurements as follows:

NDVI = (NIR - VIS) / (NIR + VIS)

where VIS and NIR stand for the spectral reflectance measurements acquired in the visible (red) and near-infrared regions, respectively. These spectral reflectances are themselves ratios of the reflected over the incoming radiation in each spectral band individually; hence they take on values between 0.0 and 1.0. By design, the NDVI itself thus varies between -1.0 and +1.0.
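The NDVI formula translates directly into NumPy; the reflectance values below are hypothetical, chosen to represent typical vegetation, soil and water-like surfaces:

```python
import numpy as np

def ndvi(nir, vis):
    """NDVI = (NIR - VIS) / (NIR + VIS), computed per pixel; lies in [-1, +1]."""
    nir = nir.astype(float)
    vis = vis.astype(float)
    return (nir - vis) / (nir + vis)

# Hypothetical reflectances: dense vegetation, bare soil, water-like surface.
nir = np.array([0.50, 0.30, 0.10])
vis = np.array([0.08, 0.20, 0.15])
print(np.round(ndvi(nir, vis), 2))  # vegetation scores highest
```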
It should be noted that NDVI is functionally, but not linearly, equivalent to the simple infrared/red ratio (NIR/VIS). The advantage of NDVI over the simple infrared/red ratio is therefore generally limited to the possible linearity of its functional relationship with vegetation properties (e.g. biomass). The simple ratio (unlike NDVI) is always positive, which may have practical advantages, but it also has a mathematically infinite range (0 to infinity), which can be a practical disadvantage compared to NDVI. Also in this regard, note that the VIS term in the numerator of NDVI only scales the result, thereby creating negative values. NDVI is functionally and linearly equivalent to the ratio NIR / (NIR + VIS), which ranges from 0 to 1 and is thus never negative nor limitless in range [4]. But the most important concept in the understanding of the NDVI algebraic formula is that, despite its name, it is a

transformation of a spectral ratio (NIR/VIS), and it has no functional relationship to the spectral difference (NIR - VIS). In general, if there is much more reflected radiation in near-infrared wavelengths than in visible wavelengths, the vegetation in that pixel is likely to be dense and may contain some type of forest. Subsequent work has shown that the NDVI is directly related to the photosynthetic capacity, and hence energy absorption, of plant canopies.

Intensity-Hue-Saturation (IHS) Colour Space Transformation

Colors generated by mixing red, green and blue light are characterised by coordinates on the red, green and blue axes of the color cube. In the hue-saturation-intensity hexcone model, hue, the dominant wavelength of the perceived color, is represented by angular position around the top of a hexcone; saturation, or purity, is given by distance from the central vertical axis of the hexcone; and intensity, or value, is represented by distance above the apex of the hexcone. Hue is what we perceive as color. Saturation is the degree of purity of the color and may be considered the amount of white mixed in with the color. It is sometimes useful to convert from RGB color cube coordinates to IHS hexcone coordinates and vice versa. The hue, saturation and intensity transform is useful in two ways: first as a method of image enhancement, and secondly as a means of combining co-registered images from different sources. The advantage of the IHS system is that it is a closer representation of human color vision than the RGB system. This transformation has been quite useful for geological applications.
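A sketch of the forward and inverse transform using Python's standard colorsys module, which implements the closely related hexcone HSV model rather than a sensor-specific IHS variant; the RGB values and the substituted intensity are illustrative:

```python
import colorsys

# Forward transform: RGB composite -> hue, saturation, value (intensity).
r, g, b = 0.6, 0.4, 0.2
h, s, v = colorsys.rgb_to_hsv(r, g, b)

# Enhancement / fusion step: replace intensity (e.g. with a stretched or
# higher-resolution band), leaving hue and saturation untouched.
v_new = 0.9
r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v_new)  # inverse transform

print(round(h, 3), round(s, 3))  # hue and saturation are preserved
```

This intensity-substitution pattern is the basis of the image-fusion use of the transform mentioned above.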
Module IV: Image Analysis

Digital Analysis - Image Rectification and Restoration: Radiometric, Atmospheric and Geometric Corrections

Introduction

In their raw form, as received from imaging sensors mounted on satellite platforms, remotely sensed data may contain flaws or deficiencies. The correction of deficiencies and the removal of flaws present in the data is termed pre-processing. Image pre-processing can be classified into three functional categories:

i) Radiometric corrections
ii) Atmospheric corrections
iii) Geometric corrections

The intent of image correction is to correct image data for distortions or degradations that stem from the image acquisition process. Image radiometry generally refers to the digital representation of the sensed data, while radiometric correction involves the

rearrangement of the digital numbers (DN) in an image so that all areas in the image have the same linear relationship between DN and either radiance or backscatter. The digital number (DN) is also known as the pixel value. Image geometry refers to the projection, scale and orientation of the image, while geometric correction refers to the modification of the input geometry to achieve the desired geometry.

Radiometric Corrections

Radiometric errors are caused by detector imbalance and atmospheric deficiencies. Radiometric corrections are transformations on the data that remove errors which are geometry-independent. Radiometric corrections are also called cosmetic corrections and are done to improve the visual appearance of the image. Multiple detectors are used in the sensor system to simultaneously sense several image lines during each sweep of the mirror. This configuration requires an array of 24 detectors (6 lines x 4 bands) in the case of MSS. As the detectors are not precisely equivalent in their output characteristics, and their output changes gradually over time, there will be different outputs for the same ground radiance; these variations must be calibrated out. To accomplish this, the scanner views an electrically illuminated step-wedge filter during each mirror sweep. Once per orbit, the scanner views the sun to provide a more absolute calibration. These calibration values are used to develop radiometric correction functions for each detector. The correction functions yield digital numbers that correspond linearly with radiance, and they are applied to all data prior to dissemination. The radiometric distortions to be corrected include: (1) missing lines, (2) periodic line striping, (3) random noise, and (4) atmospheric effects.

Correction for Missing Scan Lines (Scan Line Drop-Out): Although detectors onboard orbiting satellites are well tested and calibrated before launch, breakdown of any of the detectors may take place.
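One common repair for dropped scan lines, sketched below with NumPy, detects lines whose mean DN deviates strongly from the scene mean and replaces them with the average of the neighbouring lines; the threshold and the synthetic scene are illustrative:

```python
import numpy as np

def repair_missing_lines(img, threshold=50.0):
    """Flag scan lines whose mean deviates from the scene mean by more than
    `threshold` DN, then replace each with the average of its neighbours."""
    out = img.astype(float).copy()
    scene_mean = out.mean()
    line_means = out.mean(axis=1)
    for r in np.where(np.abs(line_means - scene_mean) > threshold)[0]:
        if 0 < r < out.shape[0] - 1:
            out[r] = (out[r - 1] + out[r + 1]) / 2.0  # neighbour averaging
    return out

scene = np.full((5, 4), 100.0)
scene[2] = 0.0  # a dropped (black) scan line
print(repair_missing_lines(scene)[2])  # restored to the neighbours' values
```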
Line drop-outs of this kind are due to errors in the scanning or sampling equipment, in the transmission or recording of image data, or in the reproduction of CCTs. The missing scan lines are seen as horizontal black (pixel value 0) or white (pixel value 255) lines on the image. Techniques are available to locate these bad lines by detecting unusually large discrepancies in image values between sequential lines. The first step in the restoration process is to calculate the average DN value per scan line for the entire scene. The average DN value for each scan line is then compared with the scene average. Any scan line deviating from the average by more than a designated threshold value is identified as defective. Once detected, defective lines may be cosmetically corrected in three ways: replacement by either the preceding or the succeeding line; averaging of the neighbouring pixel values; or replacing the line with the same line from another, highly correlated band.

Correction for Line Striping (De-Striping): A sensor is called ideal when there is a linear relationship between its input and output. Although all the detectors are well calibrated prior to launch, the response of some of the detectors may shift towards the lower or higher end. The

presence of a systematic horizontal banding pattern is frequently seen on images produced by electronic scanners, such as sixth-line banding on MSS and sixteenth-line banding on TM. Banding is a cosmetic defect, and it interferes with the visual appreciation of the patterns and features in the image. Hence corrections for these bandings are applied to improve the visual appearance and interpretability of the image. Two methods of de-striping are commonly considered; both are based upon the shape of the histograms of pixel values generated by the individual detectors in a particular band.

Atmospheric Correction: The value recorded at any pixel location on the remotely sensed image is not a record of the true ground-leaving radiance at that point, for the signal is attenuated due to absorption, and its directional properties are altered by scattering. Figure 8 depicts the effects the atmosphere has on the measured brightness value of a single pixel for a passive remote sensing system. Scattering at S2 redirects some of the incident radiance within the atmosphere into the field of view of the sensor (the atmospheric path radiance), and some of the energy reflected from point Q is scattered at S1 so that it is seen as coming from P. To add to these effects, the radiance from P and Q is attenuated as it passes through the atmosphere. Other difficulties are caused by variations in the illumination geometry (the Sun's elevation and azimuth angles). The relationship between the radiance received at a sensor above the atmosphere and the radiance leaving the ground surface can be given as

Ls = Htot ρ T + Lp

where
Htot = total downwelling radiance in a specified spectral band
ρ = reflectance of the target
T = atmospheric transmittance
Lp = atmospheric path radiance

Atmospheric correction then amounts to estimating Htot, T and Lp so that this relation can be inverted to recover the target reflectance, ρ = (Ls - Lp) / (Htot T).

Geometric Errors and Corrections

Remotely sensed data usually contain both systematic and unsystematic geometric errors.
Distortions whose effects are systematic in nature, constant, and predictable in advance are called systematic distortions; they include scan skew, mirror-scan velocity variation and panoramic distortion. Non-systematic distortions include errors due to platform altitude and attitude, platform velocity, Earth rotation, and perspective projection (Fig. 9). Moreover, remotely sensed images are not maps. The transformation of a remotely sensed image so that it has the scale and projection of a map is called geometric correction. A related technique, called registration, is the fitting of the coordinate system of an image to that of a second image of the same area. These errors can be divided into two classes: (a) those that can be corrected using data from the platform ephemeris and knowledge of internal sensor distortion, and (b) those that cannot be corrected with acceptable accuracy without a sufficient number of ground control points (GCPs).

Distortions Evaluated from Tracking Data

1. Earth Rotation: As the scanning mirror completes successive scans, the earth rotates beneath the sensor, so there is a gradual westward shift of the ground swath being scanned. This causes along-scan distortion. To give the pixels their correct position relative to the ground, it is necessary to offset the bottom of the image to the west by the amount the ground moved during image acquisition. The amount by which the image has to be skewed to the west depends upon the relative velocities of the satellite and the earth, and the length of the image frame recorded.

2. Spacecraft Velocity: If the spacecraft velocity departs from nominal, the ground track covered by a fixed number of successive mirror sweeps changes. This produces along-track scale distortion.

3. Scan-Time Skew: During the time required for the scanning mirror to complete an active scan, the spacecraft moves along the ground track. Thus the ground swath scanned is not normal to the ground track but is slightly skewed, which produces cross-scan geometric distortion. The known velocity of the satellite is used to remove this geometric distortion. The magnitude of the correction is 0.082 km for MSS.

4. Sensor Mirror Sweep: The mirror-scanning rate varies non-linearly across a scan because of imperfections in the electromechanical driving mechanism. Since data samples are taken at regular intervals of time, the varying scan rate produces along-scan distortion. The magnitude of the correction is 0.37 km for MSS.

5. Panoramic Distortion: For scanners used on spaceborne and airborne remote sensing platforms, the angular instantaneous field of view (IFOV) is constant. As a result, the effective pixel size on the ground is larger at the extremities of the scan line than at the nadir, which produces along-scan distortion. If the IFOV is β and the pixel dimension at nadir is p, then the pixel dimension in the scan direction at a scan angle of θ is

pθ = βh sec²θ = p sec²θ

where h is the altitude.
6. Perspective Projection: For some applications it is desired that Landsat images represent the projection of points on the earth onto a plane tangent to the earth, with all projection lines normal to the plane. The sensor data instead represent perspective projections, in which all projection lines meet at a point above the tangent plane. For the MSS, this produces only along-scan distortion.

Distortions Evaluated from Ground Control

1. Altitude: Departures of the spacecraft altitude from nominal produce scale distortions in the sensor data. For MSS, the distortion is along-scan only and varies with time. The magnitude of the correction is up to 1.5 km for MSS.

2. Attitude: Normally, the sensor axis system is maintained with one axis normal to the Earth's surface and another parallel to the spacecraft velocity vector. As the sensor departs from this attitude, geometric error results. Roll and pitch errors shift the image linearly; a yaw error rotates each image line about its center, with the maximum shift occurring at the edge pixels. For LISS-II, a roll error of 0.1 degree will shift the image line by 1.57 km across the track; a pitch error of the same magnitude shifts the line along the track by 1.57 km.
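Two of the per-pixel geometry effects above can be reproduced with small formulas; the nadir pixel size, altitude and angles below are illustrative approximations rather than official sensor parameters:

```python
import math

def pixel_size_at_angle(p_nadir, scan_angle_rad):
    """Panoramic distortion: along-scan pixel size p_theta = p * sec^2(theta)."""
    return p_nadir / math.cos(scan_angle_rad) ** 2

def attitude_shift_km(altitude_km, error_deg):
    """Image-line shift caused by a small roll or pitch error."""
    return altitude_km * math.tan(math.radians(error_deg))

# An MSS-like 79 m nadir pixel grows toward the swath edge (5.78 deg half-angle):
print(round(pixel_size_at_angle(79.0, math.radians(5.78)), 1))
# A 0.1 degree roll at roughly 900 km altitude shifts a line by about 1.57 km:
print(round(attitude_shift_km(900.0, 0.1), 2))
```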

Module V: Image Analysis

Principal Component Analysis and Discriminant Analysis

Spectrally adjacent bands in a multispectral remotely sensed image are often highly correlated. Multiband visible/near-infrared images of vegetated areas will show negative correlations between the near-infrared and visible red bands, and positive correlations among the visible bands, because the spectral characteristics of vegetation are such that as the vigour or greenness of the vegetation increases, the red reflectance diminishes and the near-infrared reflectance increases. The presence of correlations among the bands of a multispectral image thus implies that there is redundancy in the data, and Principal Component Analysis aims at removing this redundancy. Principal Components Analysis (PCA) is related to another statistical technique called factor analysis, and can be used to transform a set of image bands such that the new bands (called principal components) are uncorrelated with one another and are ordered in terms of the amount of image variation they explain. The components are thus a statistical abstraction of the variability inherent in the original band set. To transform the original data onto the new principal component axes, transformation coefficients (eigenvalues and eigenvectors) are obtained and then applied in a linear fashion to the original pixel values. This linear transformation is derived from the covariance matrix of the original data set. The transformation coefficients describe the lengths and directions of the principal axes. Such transformations are generally applied either as an enhancement operation or prior to classification of the data. In the context of PCA, information means variance, or scatter about the mean. Multispectral data generally have a dimensionality that is less than the number of spectral bands.
The purpose of PCA is to define the dimensionality and to fix the coefficients that specify the set of axes, which point in the directions of greatest variability. The bands of PCA are often more interpretable than the source data.
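The steps above can be sketched numerically: flatten the bands to a pixels-by-bands matrix, center it, take the covariance matrix, and project onto its eigenvectors. This is a minimal illustration on synthetic two-band data (the negatively correlated red/NIR pair is simulated, not a real scene).

```python
import numpy as np

# Synthetic two-band "image": red and a negatively correlated NIR band,
# mimicking the vegetation behaviour described in the text.
rng = np.random.default_rng(0)
red = rng.normal(100.0, 20.0, size=(50, 50))
nir = 250.0 - 1.5 * red + rng.normal(0.0, 5.0, size=(50, 50))

X = np.stack([red.ravel(), nir.ravel()], axis=1)  # pixels x bands
Xc = X - X.mean(axis=0)                           # center each band

cov = np.cov(Xc, rowvar=False)                    # band covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)            # eigh returns ascending order
order = np.argsort(eigvals)[::-1]                 # reorder by descending variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

pcs = Xc @ eigvecs                                # principal component images
corr = np.corrcoef(pcs, rowvar=False)[0, 1]
print(corr)  # ~0: the components are uncorrelated
```

PC1 captures the shared (negatively correlated) variation of the two bands, while PC2 carries the residual variance; the eigenvalues give the variance explained by each component, exactly the ordering property stated above.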


More information

LAB MANUAL SUBJECT: IMAGE PROCESSING BE (COMPUTER) SEM VII

LAB MANUAL SUBJECT: IMAGE PROCESSING BE (COMPUTER) SEM VII LAB MANUAL SUBJECT: IMAGE PROCESSING BE (COMPUTER) SEM VII IMAGE PROCESSING INDEX CLASS: B.E(COMPUTER) SR. NO SEMESTER:VII TITLE OF THE EXPERIMENT. 1 Point processing in spatial domain a. Negation of an

More information

Background. Computer Vision & Digital Image Processing. Improved Bartlane transmitted image. Example Bartlane transmitted image

Background. Computer Vision & Digital Image Processing. Improved Bartlane transmitted image. Example Bartlane transmitted image Background Computer Vision & Digital Image Processing Introduction to Digital Image Processing Interest comes from two primary backgrounds Improvement of pictorial information for human perception How

More information

Chapters 1-3. Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation. Chapter 3: Basic optics

Chapters 1-3. Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation. Chapter 3: Basic optics Chapters 1-3 Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation Radiation sources Classification of remote sensing systems (passive & active) Electromagnetic

More information

John P. Stevens HS: Remote Sensing Test

John P. Stevens HS: Remote Sensing Test Name(s): Date: Team name: John P. Stevens HS: Remote Sensing Test 1 Scoring: Part I - /18 Part II - /40 Part III - /16 Part IV - /14 Part V - /93 Total: /181 2 I. History (3 pts. each) 1. What is the name

More information

NAVAL POSTGRADUATE SCHOOL THESIS

NAVAL POSTGRADUATE SCHOOL THESIS NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS PRINCIPAL COMPONENTS BASED TECHNIQUES FOR HYPERSPECTRAL IMAGE DATA by Leonidas Fountanas December 2004 Thesis Advisor: Second Reader: Christopher Olsen

More information