Introduction to Remotely Sensed Data
Ms Barbara Harrison & Dr David L B Jupp
CSIRO Division of Water Resources


Reproduced courtesy of the authors and CSIRO Information Management, from the CSIRO publication "Introduction to Remotely Sensed Data, Part ONE of the microbrian Resource Manual". This module is an electronic version of the above manual; the text has not been altered to reflect the new format. It is part of a series, as outlined below. The printed manuals are available from:

CSIRO Publications
314 Albert St
East Melbourne Vic 3002
Australia

The microbrian Resource Manual series consolidates reference material from a wide range of sources relating to the interpretation and processing of remotely sensed data. The series consists of five publications:

Introduction to Remotely Sensed Data
Introduction to Image Processing
Image Classification and Analysis
Image Rectification and Registration
Spatial Data Integration

Part ONE, Introduction to Remotely Sensed Data, provides a basic introduction to remote sensing and is intended for users with little or no previous experience with such data. An understanding of the origin of remotely sensed data is essential for its accurate interpretation. Parts TWO to FIVE describe commonly available image processing options and how these may be used for a variety of remote sensing applications.

Part TWO, Introduction to Image Processing, introduces a wide range of image processing techniques using microbrian, while Parts THREE to FIVE cover the more advanced processing procedures. Part THREE, Image Classification and Analysis, explains different approaches to the classification of remotely sensed imagery. A classification methodology which uses both the spectral and spatial characteristics of the image is implemented in microbrian and has been successfully applied to a wide range of applications.

Part FOUR, Image Rectification and Registration, details the procedure for rectifying and registering imagery, that is, establishing a model between the geometry of an image and that of another image or map. The rectification methodology implicit in microbrian involves the use of sophisticated models for known geometric distortions in a range of satellite and scanner image systems, and conversion between a range of map co-ordinate systems.

Part FIVE, Spatial Data Integration, describes the creation of ancillary data planes and their integration with remotely sensed imagery. Its procedures provide an interface between image processing systems such as microbrian and Geographic Information Systems, and require an understanding of image rectification as defined in Part FOUR.

The microbrian Resource Manual series provides a comprehensive overview of remote sensing and image processing techniques for both students and professionals in related subject areas, as well as detailed algorithmic descriptions for users of the microbrian image processing system.


Contents

Acknowledgements
1. Introduction
   1.1 Definition
   1.2 Data Sources
   1.3 Electromagnetic Energy
   1.4 Colour
2. Data Acquisition
   2.1 Recording Methods
       Photographic
       Digital
   2.2 Sensing Devices
       Multi-spectral scanners
           Spectral regions
           Scanning operation
           a. Electromechanical
           b. Linear array
           c. Central perspective
       Microwave sensors
           Radar
           a. Side-Looking Airborne Radar (SLAR)
           b. Synthetic Aperture SLAR (SAR)
           c. Radar altimeter
           d. Microwave scatterometer
           Microwave radiometer
3. Platforms and Sensor Systems
   3.1 Aircraft
   3.2 Spacecraft
       Manned
       Unmanned
4. Data Interpretation
   4.1 Multi-spectral
   4.2 Microwave
5. Data Resolution and Extent
   5.1 Spectral
   5.2 Spatial
   5.3 Radiometric
   5.4 Temporal
6. Applications
   6.1 Overview
   6.2 Verification of Analyses
   6.3 Geographic Information Systems
   6.4 Image Processing of Remotely Sensed Data
References

Acknowledgements

Many individuals and organisations have contributed to the production of this series of manuals. The microbrian image processing system is a product of the labour of numerous scientists, programmers and users since its inception. Among these contributors, the computing talents of Kevin Mayo have been particularly significant.

The authors thank Dr Henry Barrs, Dr Brian Button and Jeff Kingwell for their careful refereeing of the draft manuscript for Part ONE, and their encouragement to pursue its wider distribution through publication. Dr Peter Harrison gave essential assistance and encouragement during the production of numerous draft documents. Guy Byrne and Ann Hosseen patiently proof-read the manuscript and suggested useful clarifications. Many microbrian users also provided valuable comments on the scope and accuracy of the draft manual.

Divisional editorial support was ably provided by Margaret Lowe, with layout and publishing being undertaken by Kevin Jeans and John Best of CSIRO Editorial Services. Expert drafting skills were painstakingly provided by Ninon Geier and Heinz Buettikofer. The careful design of illustrative material contributes significantly to the overall presentation, and their talents and patience in this area are very much appreciated. Many thanks are also due to Narelle Dittmer for accurate typing of the draft glossary.

Landsat imagery was provided by the Australian Centre for Remote Sensing (Australian Surveying and Land Information Group, Dept of Administrative Services) and AMIRA. Jenny Weissel was very helpful in advising on and arranging production of colourwrite images used in this text. SPOT imagery (used in Figure 34c) is copyright CNES. AVHRR coverage maps (Figure 27) were kindly provided at short notice by Paul Tildesley. Sources of figure, table and glossary information are indicated in the text. The Australian reference material contained in Appendices 1 to 4 was obtained directly from the relevant organisations, and their co-operation in its verification is appreciated. Every effort was made to ensure the currency of this information at the time of printing.

The authors appreciate the support of colleagues at CSIRO Division of Water Resources during manuscript preparation, particularly Kevin Mayo, Paul Hutton and Guy Byrne, who shouldered the workload left undone by the demands of publishing, and co-workers at MPA International Pty Ltd. Neil Body also took a keen interest in this project and was instrumental in allowing the manual to take its final form. Finally, we thank the microbrian user community for its continued enthusiastic support for the microbrian image processing system.

1. Introduction

1.1 Definition

Remote sensing is a general term which describes the action of obtaining information about an object with a sensor which is physically separated from the object. Such sensors rely upon the detection of energy emitted from or reflected by the object. Two common examples of remote sensing are human vision, which relies on the detection of reflected light, and sonar, which detects sound waves. In the context of microbrian, remote sensing is primarily concerned with deriving information about the Earth's surface using a sensor on an elevated platform. Remotely sensed data provide an ideal 'view' of the Earth for various resource inventory and monitoring studies. As such, remote sensing has saved considerable effort and cost compared with traditional surveying methods, and provides a consistent base for the extrapolation and interpolation of ground reconnaissance data.

1.2 Data Sources

Remote sensing devices may be designed to detect various types of energy, such as electromagnetic radiation, gravity, magnetism and sound waves. Electromagnetic energy (described in Section 1.3) is the energy form most commonly sensed by earth remote sensing devices. The source of the radiation being sensed may or may not be independent of the sensing device. Active remote sensing devices, such as radar, direct radiation of a particular form towards an object and then detect the amount of that energy which is reflected back by the object. These active remote sensing systems operate in the microwave and radio wave regions of the EM spectrum. Lidar (Light Detection And Ranging) systems are active remote sensors which operate in the ultraviolet, visible and near infrared wavelengths. Passive remote sensing relies on the radiation originating from some other source, principally the sun. Reflected solar energy is detected by passive remote sensing devices in the visible, near infrared and middle infrared regions, while the Earth's emitted energy may be detected in the middle infrared and thermal infrared wavelengths. Certain microwave sensors also fall into the passive detector category. Aerial photography and Landsat satellite imagery are examples of data collected by passive remote sensing systems.

Remotely sensed data may be recorded in either photographic or digital (numeric) form. Data must be in digital form for computer processing, so any data which were recorded photographically, such as aerial photography, need to be converted to digital form before being used in image processing systems such as microbrian. Most satellite image data and airborne scanner imagery are initially recorded in digital form. Section 2 describes recording methods, in terms of both photographic and digital data, in more detail.

Remote sensing devices may be carried on a variety of platforms, as discussed in Section 3. Characteristics of both the platform and the sensing device determine the resolution of the remotely sensed data. Section 5 discusses the spectral, spatial, radiometric and temporal aspects of data resolution and extent. Data interpretation is considered in Section 4, while applications of remote sensing are briefly overviewed in Section 6.

1.3 Electromagnetic Energy

The sun provides most of the energy which we sense as light. This energy consists of electromagnetic (EM) waves which travel in harmonic, sinusoidal motion as shown in Figure 1.

Figure 1: An electromagnetic wave comprises two sinusoidal waves: an electric wave (E) and a magnetic wave (H). Each component wave is perpendicular to the other and also to the direction of travel. Wavelength (λ) is measured as the distance between successive wave peaks or troughs. Frequency (ν) is measured as the number of cycles per second passing a fixed point.

Although all EM radiation (EMR) travels at the same speed (3 × 10^8 m sec^-1), the wavelengths (that is, the distance between consecutive troughs or crests) of the waves may vary. The resulting range of wavelengths gives rise to the electromagnetic spectrum, illustrated in Figure 2. The high energy forms of EMR, such as X-rays, have short wavelengths and high frequency (since the shorter the distance per cycle, the greater the number of cycles completed in a given time at the same speed), while low energy forms, such as radio waves, have long wavelengths and low frequency.

Figure 2: The electromagnetic spectrum. Names are assigned to regions of the spectrum for convenience; there are no clearly defined divisions. Note: scales are logarithmic.

The speed of propagation of EMR is commonly known as the speed of light, 'light' being that small part of the continuous EM spectrum to which the human eye is sensitive. This visible portion of the spectrum includes wavelengths from 0.4 to 0.7 µm (1 µm = 10^-6 m). For convenience, names are assigned to the various regions of the spectrum, but there are no clear-cut dividing lines. The regions are generally defined by the sensing method by which they are detected.

Electromagnetic energy is continuously emitted at all wavelengths by every material with a temperature above absolute zero (-273°C or 0 K). With no other objects in the universe, a material would gradually cool to 0 K by radiating all of its energy. Absorption of energy increases both the temperature and the rate of emission of a material. If the material is 'black' in that it absorbs all radiation that reaches it (a perfect absorber is referred to as a 'blackbody'), then the spectral composition and intensity of its emission are well defined and follow Planck's laws, as shown in Figure 3a.
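Since wavelength and frequency are linked by the constant speed of light, either quantity determines the other. A worked example (not from the original text), using a mid-visible wavelength of 0.55 µm:

$$c = \lambda\nu \qquad\Rightarrow\qquad \nu = \frac{c}{\lambda} = \frac{3\times10^{8}\ \mathrm{m\,s^{-1}}}{0.55\times10^{-6}\ \mathrm{m}} \approx 5.5\times10^{14}\ \mathrm{Hz}$$

Halving the wavelength doubles the frequency, which is the short-wavelength, high-frequency relationship noted above.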

Figure 3(a): Blackbody radiation. Energy emitted at different temperatures, based on a square metre of blackbody. Note: scales are logarithmic.

The sun is not a blackbody, but its radiation is similar to that of a blackbody at approximately 6000 K, with peak intensity in the spectral region we detect as visible light. Solar radiation is dispersed as it travels from the sun to the Earth, so that reduced energy levels are detected at the Earth compared with the theoretical curve, as indicated in Figure 3b. The Earth is not a blackbody either, but it emits radiation in a similar way to a blackbody at 300 K. This temperature is the result of various interactions between the Earth's surface and the atmosphere, and responds to all the Earth's processes which produce heat. This radiation is the Earth's thermal radiation.

Figure 3(b): Blackbody radiation. Solar and terrestrial energy spectra: (i) estimated solar radiation at the top of the Earth's atmosphere, based on a blackbody at 5700 degrees C; (ii) approximate energy emitted by the Earth's surface, based on a blackbody at 27 degrees C. Note: scales are logarithmic.

Because the Earth and its surface materials are not black, they do not absorb all the sun's radiation but reflect and scatter radiation as well. The spectral composition of radiation from the Earth therefore consists of both reflected and emitted components. The intensity of radiation from the Earth, when viewed from space, is greatest where the sun's radiation is greatest (that is, in the visible region) due to the reflectance of solar energy. A secondary maximum occurs at the peak of the Earth's thermal emission spectrum, corresponding to the thermal infrared region. Because the temperatures of the sun and the Earth are so different, these processes are often considered separately as 'short wave reflection' (reflection of the sun's radiation) and 'long wave emission' (emission of radiation by the Earth) and have little overlap; that is, there is very little reflected solar radiation in the thermal infrared region and very little emitted radiation in the visible region. Much of the longer wavelength energy cannot be seen or photographed but can be sensed with radiometers and scanners. The range of wavelengths in which various sensors can operate is shown in Figure 4.

Figure 4: Common remote sensing systems. Scanners and radiometers can operate outside the visible and photographic wavelength regions. Note: scale is logarithmic.

The visible region can be sub-divided into component colours, ranging from the higher energy, shorter wavelengths of violet, indigo and blue, through those of green and yellow, to the lower energy wavelengths of orange and red. What is known as 'white' light, such as sunlight, is actually a mixture of all these wavelengths. The feature we describe as the colour of an object is the energy not absorbed by that object. In the case of a green leaf, for example, the blue and red wavelengths of light are absorbed by the chloroplasts and used as an energy source for photosynthesis, but a greater proportion of the green wavelengths is reflected (see Figure 5). In this case our eyes detect this reflectance as the colour green. By a learned association of the colour green with living foliage, combined with other features such as size and shape, we identify the object and its state (living or dead). The same principle forms the basis of identifying features in remote sensing.
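Returning to Figure 3, the positions of the solar and terrestrial emission peaks quoted above can be checked with Wien's displacement law (a standard blackbody result, not stated explicitly in the manual):

$$\lambda_{\max} = \frac{2898\ \mu\mathrm{m\,K}}{T} \qquad\Rightarrow\qquad \lambda_{\max}^{\,sun} \approx \frac{2898}{6000} \approx 0.48\ \mu\mathrm{m}, \qquad \lambda_{\max}^{\,earth} \approx \frac{2898}{300} \approx 9.7\ \mu\mathrm{m}$$

The solar peak falls in the visible region and the terrestrial peak in the thermal infrared, as Figure 3b indicates.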

Figure 5: Reflectance characteristics of a typical green leaf structure. Chloroplasts reflect green wavelengths but absorb blue and red wavelengths for use in photosynthesis. Spongy mesophyll cells strongly reflect infrared wavelengths. [Adapted from Barrett and Curtis 1976]

Different land cover features have varying, and often characteristic, reflectance in the non-visible wavelengths (such as infrared or thermal infrared). Consequently, 'artificial' sensors which are capable of detecting these waves give us more information with which to identify an object than what we know as colour. Figure 6 illustrates this for three major land cover features (vegetation, water and bare soil).

Figure 6: Spectral reflectance curves for two types of green vegetation, light and dark soils, and clear and turbid water.

The interaction of incoming radiation with surface features depends on both the spectral reflectance properties of the surface materials and the surface smoothness relative to the radiation wavelength. A relatively 'smooth' surface which reflects energy without any scattering (that is, the angle of incidence equals the angle of reflection) is called a 'specular' or 'mirror' reflector. 'Diffuse' or 'Lambertian' reflectance occurs when the surface is rough relative to the wavelength(s) of the incoming radiation, causing the energy to be reflected equally in all directions. These surface reflection types are illustrated in Figure 7. Calm water can act like a specular reflector. Most of the Earth's other surface materials are diffuse reflectors in the visible wavelengths but may display specular reflectance properties in the microwave wavelengths. Surface orientation relative to both the energy source and the sensing device also affects the amount of radiation detected by the sensor.
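The two reflection types can be summarised compactly (standard definitions, added here for reference): a specular surface obeys the mirror law, while an ideal Lambertian surface appears equally bright from all viewing directions, its radiant intensity falling off with the cosine of the view angle:

$$\theta_{r} = \theta_{i} \quad\text{(specular)} \qquad\qquad I(\theta) = I_{0}\cos\theta \quad\text{(Lambertian)}$$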

Figure 7: Specular and diffuse reflectors. (a) Specular reflection from a smooth surface (angle of incidence equals angle of reflection). (b) Diffuse reflection from a rough surface.

Various aspects of the interaction of electromagnetic radiation with the Earth's surface are discussed in Section 4 and are well reviewed by Swain and Davis (1978), Slater (1980), Colwell (1983) and Curran (1985).

Certain regions of the EM spectrum are completely absorbed by the various gases that make up the atmosphere, so that wavelengths in these regions cannot be used for remote sensing of the Earth's surface. The regions of the EM spectrum which are not affected by the Earth's atmosphere are called 'atmospheric windows' and are shown in Figure 8. Remote sensing data of the Earth's surface can only be obtained from systems operating within these bounds. These windows allow detection of reflected solar energy from the Earth's surface in the visible and infrared wavelengths, the Earth's emitted energy in two regions of the thermal infrared, and radar or passive microwave energy in the longer wavelengths.

Within these windows, atmospheric conditions such as haze, fog and clouds can still affect remote sensing of features on the Earth's surface through scattering of the EM waves by particles in the atmosphere. When these particles are much smaller than the wavelength of the radiation, the scattering effect is termed 'Rayleigh scattering'. This effect is most severe at the shorter wavelengths and is the reason that the sky appears blue (the blue light is scattered to such an extent by the atmosphere that it appears to reach our eyes from all directions) and sunsets appear red (the sun's rays follow a longer path through the atmosphere from the horizon, during which the shorter wavelengths are much scattered, leaving only the longer, orange and red, wavelengths for us to see). Scattered light reaching the Earth is referred to as diffuse, rather than direct, radiation or 'skylight'. The radiation which has been scattered by the atmosphere and reaches the remote sensing device without contacting the Earth's surface is termed 'atmospheric path radiance'. This form of scattering typically reduces the radiometric extent of radiance measurements in the blue region of the electromagnetic spectrum (see Section 5.3).

Figure 8: Atmospheric transmittance. Some wavelength regions of the EM spectrum are absorbed by atmospheric gases and so cannot be used for remote sensing of the Earth's surface features. Note: scale is logarithmic.

A second scattering mechanism, referred to as 'Mie scattering', occurs when the particle size is comparable to the radiation wavelength. Particles of smoke, dust, salt and water can cause this type of scattering. For example, conditions of heavy atmospheric haze produce Mie scattering at visible and infrared wavelengths, while rain can result in Mie scattering of microwaves.
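For Rayleigh scattering, the scattered intensity varies inversely with the fourth power of wavelength, a standard result which the text alludes to rather than states. Comparing representative blue and red wavelengths:

$$\frac{\text{scattering at } 0.45\ \mu\mathrm{m}}{\text{scattering at } 0.65\ \mu\mathrm{m}} = \left(\frac{0.65}{0.45}\right)^{4} \approx 4.3$$

so blue light is scattered roughly four times more strongly than red, which is why a clear sky appears blue.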

'Non-selective scattering' is produced by water droplets, ice fragments or large dust particles which are larger than the radiation wavelength. Such particles directly reflect any incident radiation. When the particles are sufficiently dense they act as an opaque layer between the sensor and the Earth's surface. Cloudy and dust-laden atmospheres cause non-selective scattering in the visible and infrared wavelengths, and thus reduce the effectiveness of data in these spectral regions for remote sensing of the Earth's surface. Fog can have a similar effect on thermal infrared wavelengths. Microwaves have much longer wavelengths, so are usually only affected by very heavy rain, where the size and density of water droplets is much higher.

These various forms of scattering increase the haze level, or reduce the contrast, in an image. Image processing methods can be applied to data obtained under such conditions to remove or reduce the effect of atmospheric 'noise', as further discussed in Part TWO, Introduction to Image Processing. Slater (1980) and Teillet (1986) provide comprehensive reviews of the causes of atmospheric effects in remotely sensed imagery, and of processing methods for their reduction. The experiences of the Canada Centre for Remote Sensing in the radiometric correction of satellite and airborne remote sensing data are reviewed in Ahern et al. (1987).

Remote sensing studies of the atmosphere itself, however, use absorption and scattering characteristics to measure atmospheric state and composition. In these studies the absorption bands, rather than the windows, are preferred, and scattering effects indicate atmospheric condition.

1.4 Colour

The human sensation of colour is considered to be due to the sensitivities of three types of neurochemical sensors (which are present in the cones of the retina) to different wavelengths in the visible region of the EM spectrum. Each sensor is associated with one type of cone and responds to a range of wavelengths with varying sensitivity, as shown in Figure 9. One type of sensor is maximally sensitive to short wavelengths, with a peak response at approximately 0.44 µm. This is often referred to as the blue sensor and is insensitive to wavelengths longer than 0.52 µm. The second sensor has peak sensitivity at 0.53 µm, or green light. The third is referred to as the red sensor, although its peak sensitivity actually occurs at 0.57 µm, which is the wavelength of yellow light. However, of the three, this third sensor still has the highest absorption of red light, and its sensitivity was probably 'designed' to avoid responding to near infrared wavelengths. (It has been suggested that visual sensitivity to near infrared radiation would be undesirable in vegetated landscapes!)

Figure 9: Sensitivities of human vision sensors: (a) blue cone absorption; (b) green cone absorption; (c) red cone absorption. [Adapted from Cornsweet 1970]

The human colour response is determined by the ratio of the neural activities of these three sensors. This ratio changes with the wavelength of the perceived light. For example, a single wavelength light at 0.45 µm produces a strong response from the blue sensor, a weaker response from the green sensor and a still weaker response from the red sensor. Similarly, red radiation produces the strongest response from the red sensor and much weaker responses from the other two sensors. Yellow light will also invoke a strong response from the red sensor, but the green and blue responses will be stronger than for red light, hence allowing the two colours to be differentiated. However, our final perception of the colour of an object is influenced by the ambient light intensity, the object's size and proximity to other objects, and the peculiar sensitivities of each human eye.

Because of this relationship between visual sensitivity and wavelength, it is difficult to classify colours in terms of brightness. Luminosity, a scale on which the energy of light is corrected for the human eye's sensitivities, is used as a rough approximation of actual perceived brightness (Padgham and Saunders 1975). Although the eye has a remarkable capacity for detecting light, it is not a good discriminator in terms of perceived brightness. Drury (1987) reports that for objects with visual angles similar to those typically found in remotely sensed imagery, only 20 to 30 different brightness steps are discernible. Perception of colour is a different matter, however, with hundreds of thousands, or even millions, of colours being distinguishable to the human eye. These human factors have considerable relevance to the visual interpretation of remotely sensed imagery.

Colour spaces, or models, attempt to describe the colours perceived by human beings. The three peak sensitivity wavelengths described above, which are referred to as blue, green and red, form a natural basis or co-ordinate system for describing colour measurement. It should be remembered, however, that human colour perception is not implicitly related to the physical measurement of spectral colour.

In colour science, these peak-sensitivity wavelengths are referred to as the additive primaries, since they add together to give white light, or the RGB (red-green-blue) system. As shown in Figure 10, the brighter, secondary colours may be formed by combinations of pairs of these primaries. In this system:

red and green combine to produce yellow
blue and red combine to produce magenta
green and blue combine to produce cyan.

Various colour representation devices, such as projectors and display monitors, utilise this additive property of radiation. Colour display monitors use red, green and blue phosphors, which are excited by three separately controlled electron guns, to achieve a mixture of coloured light at the monitor's surface.

Figure 10: Additive colour mixing. Red, green and blue primaries are added to a black background to give other colours.

Figure 11: Subtractive colour mixing. Yellow, magenta and cyan primaries subtract from a white background to produce other colours.

Colour mixing for film dyes and print media uses the subtractive primaries of yellow, magenta and cyan (YMC). These coloured pigments are mixed to produce darker colours, that is, they subtract from white. In the subtractive colour system:

magenta and cyan combine to produce blue
yellow and cyan combine to produce green
yellow and magenta combine to produce red.

As illustrated in Figure 11, these subtractive primaries produce secondary colours by removing their complementary colours from white. The relationship between additive and subtractive primaries is illustrated in Figure 12.

Figure 12: Relationship between additive and subtractive primary colours.

The use of colour in image processing, interpretation and presentation is further discussed in Part TWO, Introduction to Image Processing.
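The complementary relationship between the two primary systems can be made concrete with a minimal Python sketch (illustrative only; not part of the original manual). Additive mixing sums light, and each subtractive primary is simply white minus one additive primary:

```python
# Additive primaries as (R, G, B) triples on a 0-255 scale.
RED   = (255, 0, 0)
GREEN = (0, 255, 0)
BLUE  = (0, 0, 255)

def add_mix(c1, c2):
    """Additive mixing: combine light, clipping each channel at 255."""
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

def complement(c):
    """Subtract a colour from white, giving its subtractive complement."""
    return tuple(255 - a for a in c)

print(add_mix(RED, GREEN))   # (255, 255, 0)  -> yellow
print(add_mix(BLUE, RED))    # (255, 0, 255)  -> magenta
print(add_mix(GREEN, BLUE))  # (0, 255, 255)  -> cyan
print(complement(RED))       # (0, 255, 255)  -> cyan is white minus red
```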

2. Data Acquisition

2.1 Recording Methods

Remote sensing devices may record data directly onto photographic film, or electronically code radiation in numeric form to produce a digital image. Photographic recording of data simply requires standard film processing to obtain an image for human interpretation, but such data need to be digitised before being processed by a computer. A digitally recorded image, however, is already in a computer compatible format but can only be viewed after some processing by a computer.

Photographic

Remotely sensed photographic data are produced by directly recording the radiation from an object onto photographic film. The range of wavelengths which may be detected by photographic devices is limited by the sensitivities of the film and filter(s) being used in the camera. The spectral sensitivity of photographic film can range from ultraviolet to near infrared wavelengths. Filters may be used in conjunction with different film types to restrict the wavelengths being recorded, or to reduce the effect of atmospheric haze. Multi-band cameras, which simultaneously record multiple photographic impressions of an object, may be used to simulate a multi-spectral image. Such cameras use varying film and filter combinations to record different spectral regions in each photograph.

Photographic film may be sensitive to wavelengths over a single range (such as single layer black and white film) or in three wavelength ranges (such as three layer colour film). The four types of film which may be used for aerial photography are:

black and white visible, or 'panchromatic'
black and white near infrared
colour visible
false colour near infrared.

Panchromatic film can be of two types:

'mapping' film, which has roughly equal sensitivity to all visible wavelengths. This is the most commonly used film for aerial photography. It may be used in conjunction with a 'minus-blue' (blue absorbing) filter to reduce the effects of atmospheric scattering and haze in the blue end of the EM spectrum.
'reconnaissance' film, which is less sensitive in the blue wavelength region to reduce the effect of atmospheric scattering.

Panchromatic film is sensitive to wavelengths in the range 0.3 to 0.7 µm (see Figure 13). The spectral sensitivity of black and white infrared film extends to about 0.9 µm in the near infrared region (see Figure 13). It can be used with a dark red absorption filter to record near infrared radiation only, or with appropriate filters to record selected regions in both visible and near infrared wavelengths.

Figure 13: Spectral sensitivities of black and white visible ('panchromatic') and infrared films. [Adapted from Drury 1987]

As colour film is designed to mimic human vision, its three emulsion layers are sensitive to the three wavelength regions of blue, green and red. Colour infrared film has different sensitivity to normal colour film and is used in conjunction with a yellow (blue absorbing) filter to record green, red and infrared radiation onto its three emulsion layers. This relationship produces false colour imagery with characteristic colours for different land covers: green vegetation appears red; deep clear water is dark blue; turbid water is bright blue; red soil appears green; and bright urban areas appear pale blue. The concepts of colour and colour mixing were introduced in Section 1.4 and are further discussed in Part TWO. The sensitivities of colour and colour infrared films are illustrated in Figure 14.

Figure 14: Spectral sensitivities of colour and colour infrared films. [Adapted from Drury 1987]

Remote sensing devices which record data photographically require that the film be recoverable for processing. Such devices can be carried by aircraft or retrievable spacecraft (such as the Space Shuttle). Examples of data collected this way are aerial photography, Large Format Space Camera imagery and Shuttle Imaging Radar scenes (see Section 3).

The scale of aerial photography may be determined by dividing the focal length of the camera by the vertical height of the lens above ground level. For example, a 150 mm focal length lens flown 1500 m above the ground gives a scale of 0.15/1500, or 1:10 000. If either of these parameters is unknown, the scale may also be computed by dividing the distance between two points on the photograph by the ground distance separating them. In either case, the units used to measure each parameter pair must be consistent.

Advantages of photographic imagery are the technical simplicity of its processing and interpretation. Both these factors tend to mean that it is available at a lower cost than digitally recorded data. However, unlike digital scanners, photographic devices can only directly detect radiation in the visible and near infrared range of the electromagnetic spectrum, so such data are affected by cloud cover. Photographic imagery may also be produced for other wavelength regions using film recording techniques, after the radiation has been initially recorded by appropriate sensors. The range of processing options available for photographic imagery (without digitising) is greatly reduced compared with the range for digital imagery. Also, the interpretation of photographic imagery relies heavily on the skills and time of a trained human interpreter. Photographic data may, however, be converted into a digital image via the digitising process described below.

Digital

Data need to be in digital, or numeric, form to be processed by a computer. In a digital image, colours are represented by numbers. A grid (or some other systematic tessellation) pattern is used to record the colours in the image, each cell being assigned one or more colour numbers. Digital scanners directly record radiation in numeric format. In most cases, a zero value indicates no (or the minimum detectable) radiation level in a cell, and the maximum recordable number indicates the maximum detectable radiation level. For much remotely sensed imagery the maximum detectable radiation level is recorded as 255, resulting in a range of 256 possible brightness levels in the digital image.

Any picture, photograph or map can be digitised. Automatic scanning devices, which operate in a similar manner to satellite scanning systems, can be used in a laboratory to convert coloured or black and white maps, pictures or photographs into digital images for processing by computer. The conversion of an image from pictorial to 'raster' format is called scan digitising (as opposed to line digitising), and involves two basic processes:

a. Sub-divide the image, using a grid (or some other systematic tessellation) pattern, into small cells called picture elements or pixels. Obviously, a finer digitising grid will produce a digital image with greater spatial detail but also with a larger number of pixels.
b. Assign a single numeric value to each pixel to represent its overall brightness level. When a number has been assigned to every pixel, the image is represented by a two dimensional integer array, as shown in Figure 15.

Figure 15: A digital image. The 'colour' of each grid cell is represented by a number.

A colour image is scanned into three channels, with each channel representing one of the primary colours: blue, green and red. It is possible to recombine several (usually three) registered image channels through a colour additive technique (detailed in Part TWO) to generate a colour composite image. The specific colours thus regenerated depend on the grey level patterns in each individual image channel and on the colour filters used in the additive process. Those colour renditions that simulate the original scene colours are called natural colour images; those consisting of significantly different colours to the original scene are referred to as false colour images. The digital image and its processing are further discussed in Part TWO, Introduction to Image Processing.

While the scan digitising process is usually automated, with the grid size being quite small and very subtle changes in colour being recorded as different digital values, the process can be simulated manually for simplified maps. Part TWO details the ways gridded or raster data may be input to the microbrian system. Boundary information may also be line digitised, so that each boundary is represented by a string of co-ordinate locations or 'vectors'. The microbrian system allows data in this format to be converted to raster format. This process is described in detail in Part FIVE, Spatial Data Integration.
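The two-step scan digitising process described above can be sketched in Python (a minimal illustration; the grid size and brightness function are invented for the example). It sub-divides a scene into pixels and assigns each one an integer brightness in the range 0 to 255:

```python
import numpy as np

def scan_digitise(brightness, n_rows, n_cols):
    """Sample a continuous brightness function onto a grid of pixels,
    quantising each sample to an integer digital number (DN) 0-255."""
    image = np.zeros((n_rows, n_cols), dtype=np.uint8)
    for i in range(n_rows):
        for j in range(n_cols):
            # Sample at the cell centre, on a 0..1 coordinate system.
            x = (j + 0.5) / n_cols
            y = (i + 0.5) / n_rows
            dn = round(255 * brightness(x, y))
            image[i, j] = min(max(dn, 0), 255)
    return image

# A hypothetical scene whose brightness increases smoothly left to right.
ramp = lambda x, y: x
print(scan_digitise(ramp, 4, 4))  # a 4 x 4 two-dimensional integer array
```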

2.2 Sensing Devices

Harris (1987) and Richards (1986) review the characteristics of currently available remote sensing imagery from satellite-borne sensing devices. Richards (1986) also details the available aircraft-borne sensors. Curran (1985) describes the use and features of aerial photography and both aerial scanner and satellite scanner imagery. Lillesand and Kiefer (1979) detail various aspects of remotely sensed image acquisition and interpretation, with particular emphasis on aerial photographic data. Section 3 summarises the platforms most commonly used to carry remote sensing devices and briefly describes some of the sensor systems currently in operation. Aspects of interpreting data from these devices are considered in Section 4.

Multi-spectral scanners

Spectral regions: Multi-spectral scanners (MSS), as the name implies, are a particular class of remote sensing device which sense radiation in multiple wavelength regions of the visible, near infrared, middle infrared and thermal infrared parts of the electromagnetic spectrum. As wavelengths in these regions of the spectrum are strongly affected by atmospheric scattering, the usefulness of these devices for earth surface studies may be limited by atmospheric conditions. MSS devices digitally record the detected radiation in a number of defined wavelength 'channels' or 'bands'. The principle of this mode of operation is the same as that of using filters on a camera to photograph limited parts of the visible spectrum. For example, when using an appropriate filter to photograph only blue light, a purely red object would appear black, since only blue radiation will pass through the filter to expose the film.

For example, the Landsat 1 to 5 satellites (see Section 3) have carried an MSS which senses four regions or 'bands' of the EM spectrum. These are:

green: 0.5 to 0.6 µm (band 4)
red: 0.6 to 0.7 µm (band 5)
near infrared: 0.7 to 0.8 µm (band 6)
near infrared: 0.8 to 1.1 µm (band 7)

(Bands 1 to 3 were associated with another instrument carried on the first three satellites.) The spectral sensitivities of these wavelength bands are illustrated in Figure 16 and compared with the sensitivities of colour and colour infrared films in Figure 17.

Figure 16: Spectral sensitivity of Landsat MSS bands. The actual sensitivity within each bandwidth varies with wavelength, as shown by the different line types.

Figure 17: Comparison of the spectral sensitivity of Landsat MSS bands with the three emulsion layers used in colour and colour infrared film. [Adapted from Lillesand and Kiefer 1979]

An additional scanning device, called the Thematic Mapper (TM), operates on Landsats 4 and 5. This instrument has the following seven spectral channels:

blue/green: 0.45 to 0.52 µm
green: 0.52 to 0.60 µm
red: 0.63 to 0.69 µm
near infrared: 0.76 to 0.90 µm
near middle infrared: 1.55 to 1.75 µm
middle infrared: 2.08 to 2.35 µm
thermal infrared: 10.40 to 12.50 µm

Details of other multi-spectral scanning devices used for remote sensing are presented and discussed in Section 3, while interpretation of these data is considered in Section 4.

Most sensing systems in the visible and infrared regions of the EM spectrum are passive detectors of reflected solar radiation or emitted thermal radiation. Lidar (Light Detection And Ranging) is an active remote sensing device which operates in the wavelength range from ultraviolet to near infrared. The laser directs pulsed or continuous radiation through a collimating system, while a second optical system collects the returned radiation and focuses it onto a detector. Lidar is only effective in clear atmospheric conditions, and such devices are thus aircraft-borne. Applications for lidar currently include mapping (especially in bathymetry), spectroscopy (for air and water pollution studies) and altimetry.

Scanner operation: Multi-spectral scanners operate in a number of different ways. They can be grouped into three basic categories depending on the mechanism used by the sensors to view each pixel. These are:

a. Electromechanical: the sensor oscillates from side to side to form the image;
b. Linear array: an array of detectors is used to simultaneously sense the pixel values along a line; and
c. Central perspective: the sensing device does not move relative to the object being sensed during image formation, so views all pixels from the same central position, in a similar way to a photographic camera.

These categories are illustrated in Figure 18 and discussed in more detail below.

Figure 18: Multi-spectral scanning operations. (a) Electromechanical: sensor records pixels sequentially along each line from line centre. (b) Linear array: pixels recorded simultaneously along each line using an array of detectors at line centre. (c) Central perspective: sensor is positioned at image centre and records lines sequentially.

a. Electromechanical: In the case of an electromechanical scanning system carried on an aircraft, the image is formed by a side-to-side scanning movement as the plane travels along its path, as shown in Figure 19. Each line is automatically divided into pixels in the scanning process, as shown in Figure 20, with the land cover(s) within each individual pixel determining the radiation levels detected by the scanner.

Figure 19: Operation of an electromechanical aircraft scanner. Image lines are formed sequentially by scanning side-to-side across the flight path. [Adapted from Lillesand and Kiefer 1979]

This type of scanner uses an oscillating mirror to reflect the radiation onto its detectors. These detectors are located behind filters which allow broad spectral bandwidths to pass through. The detected radiation is converted into a continuous electronic signal which is then sampled at regular time intervals to give discrete measurements, or pixels, along each scan line. The main limitation of this scanning mechanism is the restricted time available to read each detector. This generally requires that such scanners have rather broad spectral bands to achieve an adequate signal-to-noise ratio. The oscillating movement of the mirror may also result in some inconsistencies in the scanning rate, leading to geometric problems in the imagery (see Part FOUR, Image Rectification and Registration).

The Landsat MSS and TM sensors (see Section 3.2.2) use oscillating electromechanical scanners and the satellite's forward velocity to form imagery. Unlike the MSS, the Landsat TM uses a scanning mirror which collects data on both the forward and reverse scans. Figure 21 shows the characteristics of the scanning system for the Landsat MSS. This system operates six sensors for each spectral band, so that six lines of image data are received each time the scanner moves from side to side. The satellite travels from north to south as the sensing systems scan west to east. These combined movements produce the characteristic westerly skew in Landsat imagery (see Part FOUR). Also, since the six lines are imaged by separate detectors, slight miscalibration between the detectors can lead to a striping pattern in the imagery. This problem is further discussed in Section 5 and Part TWO.
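The six-detector striping effect can be illustrated with a small Python sketch (the gain values are hypothetical; real miscalibrations are much smaller):

```python
import numpy as np

# A hypothetical uniform scene: every pixel has true radiance 100.
n_lines, n_pixels = 12, 8
scene = np.full((n_lines, n_pixels), 100.0)

# Six detectors per band, each with a slightly different (miscalibrated) gain.
gains = np.array([1.00, 1.03, 0.97, 1.01, 0.99, 1.02])

image = np.empty_like(scene)
for line in range(n_lines):
    detector = line % 6          # image lines cycle through the six detectors
    image[line] = scene[line] * gains[detector]

# Every sixth line shares a detector, so the gain errors repeat as
# horizontal stripes with a six-line period, even over a uniform scene.
print(image[:, 0].round(1))
```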

Figure 20: MSS measurements along one scan line. Each set of radiance readings along the line represents a pixel in the image. [Adapted from Lillesand and Kiefer 1979]

Figure 21: Landsat MSS operation. Six sensors for each band allow six image lines to be recorded during each scan.

b. Linear array: A more recent scanner design uses a linear array of 'charge coupled device' (CCD) detectors to form an image line, with each detector being used to read the value for an individual pixel along the line (see Figure 18b). This design is also referred to as a 'pushbroom' scanner, since the image is formed by the sensor being swept forward by the platform's velocity. The advantages of this design are that the scanner does not have any moving parts to cause timing inconsistencies, and can allow a longer dwell time, and hence narrower spectral channels, per detector. This results in a cheaper, lighter and smaller device with lower power requirements and greater reliability, as well as higher spatial and radiometric resolutions. A disadvantage, however, is that the larger number of detectors requires very accurate calibration to avoid vertical striping in the imagery. A further limitation is that CCD technology is not readily available for wavelengths longer than near infrared. In certain cases, due to finite detector read-out times, the number of spectral bands also needs to be limited in linear array scanners to ensure that the same ground locations are being imaged in each band.

The SPOT HRV (see Section 3.2.2) uses a pushbroom scanner with four arrays of 1728 detectors. The Modular Optoelectronic Multi-spectral Scanner (MOMS), which was carried on the Space Shuttle, used linear array sensors. The Multi-spectral, Electronic, Self-Scanning Radiometer (MESSR) currently carried on the Japanese MOS-1 satellite is also a pushbroom scanner. A number of future sensors are being developed using this technology.

c. Central perspective: The third category of scanning operation can utilise either electromechanical or linear array technology to form image lines, but images each line from a perspective at the centre of the image rather than the centre of each line as in a. and b. above (see Figure 18c). This results in geometric distortions in an image similar to those which occur in photographic data. In satellite-derived imagery, however, radial displacement effects are barely noticeable because the field of view is so small relative to the orbital altitude. Laboratory-based scanners commonly use the central viewing perspective for image formation. The extent of distortion in the resulting image depends on the optics of the scan configuration used, including the size of the original being scanned. The early frame sensors used in vidicon cameras (such as the Return Beam Vidicon in Landsats 1, 2 and 3) operated from a central perspective by exposing a two-dimensional array of detectors (or photosensitive tubes in the early models) for a shutter controlled time period. Although operating with a different mechanism, imagery from the geostationary satellites (described in Section 3.2.2) is also essentially formed from this perspective.

Microwave sensors

Microwave sensors operate between wavelengths of approximately 1 to 1000 mm. Microwave sensing devices can be used in both active and passive systems. In active systems, such as radars, the sensor supplies the energy input as well as detecting the response. Passive microwave sensors use the natural radio emission of the Earth, and the effects on it of earth and atmospheric constituents, to sense a variety of geophysical parameters. Remote sensing in the microwave wavelengths offers the opportunity of selecting bands which are unaffected by cloud cover and most weather conditions (see Section 1.3). Active microwave sensors are also independent of sun illumination and hence may be used 24 hours a day. In many applications and geographic regions, such as the perennially cloudy tropics, these features provide a guarantee of data acquisition which is not available for systems operating in visible and infrared wavelengths.

Radar

Active microwave sensors are based on radar devices. Radar is an acronym for RAdio Detection And Ranging, that is, using radio waves to detect objects and determine their position or 'range'. The principle of operation of these devices is to direct pulses of microwave energy at an object, then record the strength, origins and sometimes polarisation of the reflected energy or 'echoes'. These pulses are transmitted for a very short time period (microseconds) and alternate with the recording of echoes. The distance between the transmitter and the reflecting object is then determined from the return time of the signal echoes. Radar signals may be transmitted at a range of wavelengths. The standard bandwidths used and their letter codes are given in Table 1.

Table 1: Standard radar codes and bandwidths.

These ranges are illustrated in Figure 22 relative to the radio spectrum. X, C and L bands are most commonly used for earth resource applications. The atmospheric effect on transmitted signals varies with wavelength. Slight attenuation occurs at wavelengths less than 30 mm under clear atmospheric conditions, with attenuation increasing as wavelength decreases. Heavy precipitation will produce a strong radar echo at wavelengths of 10 mm or less (this principle being used in aircraft weather detection radar systems).

The transmitted radar signals can be generated so that the electrical wave vibrations are restricted to a single plane perpendicular to the direction of wave propagation (rather than vibrating in all directions perpendicular to that of propagation). This filtering process is referred to as polarisation. Two modes of polarisation are used in SLAR: these are referred to as horizontal (H) and vertical (V). The same or the alternate mode may be used to transmit and receive radar signals, resulting in four possibilities: HH or VV imagery are referred to as 'like-polarised'; HV or VH as 'cross-polarised'. Different surface features modify the polarisation of reflected microwave energy in varying degrees, with the polarisation mode affecting the recorded echo and thus how the objects appear in the resulting imagery.
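The ranging principle described above reduces to two standard relations (the numbers below are illustrative, not from the manual). An echo returning after time t has travelled out and back, and the pulse duration τ limits how close two objects can be and still return separable echoes:

$$R = \frac{ct}{2} \qquad\qquad \Delta R = \frac{c\tau}{2}$$

For example, an echo received t = 20 µs after transmission implies a range of R = (3 × 10^8 × 20 × 10^-6)/2 = 3000 m, while a pulse of duration τ = 0.1 µs cannot separate objects closer than 15 m in slant range.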

Figure 22: The microwave and radio wave spectra. Note: scales are logarithmic. [Adapted from Ulaby et al. 1981]

Four types of radar-based systems are commonly used for microwave remote sensing on aircraft and satellite platforms. These are:

a. Side-Looking Airborne Radar (SLAR): A radar pulse is transmitted off-nadir by an antenna fixed below an aircraft to image large ground areas adjacent to the flight line. The echoes are processed to produce an amplitude/time video signal which is then recorded as an image line, with brighter pixels indicating higher energy returns. This procedure is illustrated in Figure 23. The oblique look angle used to acquire radar imagery results in characteristic geometric distortions such as radar shadows and layover effects. The pixel size of such imagery is determined by the pulse time length and the beamwidth of the antenna, as shown in Figure 23. The time duration of the pulse determines whether the signals from adjacent objects will overlap and hence not be distinguished separately. The beamwidth of the SLAR antenna is directly proportional to the pulse wavelength and inversely proportional to the length of the antenna. Systems which utilise the actual antenna size are called real aperture radars (RAR). Because of the physical limitation of antenna size on aircraft, such systems are restricted to short range and low altitudes (which limit the extent of coverage) and relatively short wavelengths (which experience greater atmospheric interference from attenuation and dispersion).

Figure 23: Operation of Side-Looking Airborne Radar. (a) The propagation of one radar pulse is shown using solid lines to indicate the wavefront locations at time intervals 1 to 12. The reflected waves or echoes are shown by dashed lines, beginning at time 7 for the house and 9 for the tree. These return signals reach the antenna at times 13 and 17 respectively. (b) The antenna response graph shows a strong echo for the house at time 13 and a weaker echo for the tree at time 17. The strength of the echo depends on the way an object reflects radio waves. (c) Pixel size in SLAR imagery is determined by the time duration of the pulse and the beamwidth of the SLAR antenna. [Adapted from Lillesand and Kiefer 1979]

b. Synthetic Aperture SLAR (SAR): The physical antenna length may be effectively lengthened by processing return signals according to their Doppler shifts (that is, the change in wave frequency as a function of the relative velocities of transmitter and receiver). This basic principle is used in Synthetic Aperture SLAR (or SAR), as shown in Figure 24. This processing requires that both amplitude and frequency signals are recorded from objects throughout the time period in which they are within the beam of the moving antenna. This weighted averaging of multiple returns, using the platform motion, effectively lengthens the antenna and allows higher spatial resolution imagery. For example, the Active Microwave Instrument (AMI) planned for ERS-1 will have a 30 m pixel size and an 80 km swath width.

Figure 24: Operation of Synthetic Aperture SLAR. Recording the frequency as well as the amplitude of echoes allows the surface illuminated by the radar beam to be sub-divided into three regions: (i) a narrow strip perpendicular to the flight line where the echo frequency matches the transmitted frequency; (ii) a region ahead of the aircraft where echoes are up-shifted in frequency; and (iii) a region behind the aircraft where echoes are down-shifted in frequency. [Adapted from Drury 1987]

c. Radar altimeter: This is a non-imaging radar which detects the EM backscattering of a surface from a narrow pulse at near normal incidence. As mentioned above, the basic function of radar devices is to measure distance. By directing the pulse onto the Earth's surface from a nadir position (that is, directly overhead), the distance being measured is the altitude of the scanning platform above the surface. Over oceans, radar altimeters are used to determine significant wave height, wind speed and mesoscale topography. The latter application has produced otherwise unattainable data describing the marine geoid and the related sea floor topography. Data from this non-imaging device can also be plotted in image format, such as to map sea surface topography, if the required data volume is available. Other applications include mapping the surface topography and type of ice masses and sea/ice boundaries. A radar altimeter was carried on the Seasat satellite and one is planned for ERS-1.

d. Microwave scatterometer: This device measures the microwave scattering or reflective properties of surfaces. It is also non-imaging, and is specifically designed to measure backscattering. This requires that it detect more detailed spectral information than imaging radar, but it has reduced spatial resolution and areal coverage. The surface is scanned in two or more directions, usually by multiple sensors. The primary application of microwave scatterometers is to measure wind vectors (that is, speed and direction) over the ocean surface. This is based on the principle of ocean 'roughness' being caused by wind and resulting in a characteristic surface pattern which may be identified by its scattering properties. A mathematical model uses the reflectivity data, with other sensor characteristics, to determine wind speed and direction over the surface. Although radar scatterometers are non-imaging, their data may be used to construct global wind maps when collected over a sufficiently large area.

Microwave radiometer

These passive sensors operate at shorter microwave wavelengths than radar, up to 300 mm. As mentioned above, they rely on detecting the emission of thermal radiation in the microwave region. The range of wavelengths detected by microwave radiometers (shown in Figure 4) corresponds to the very low energy end of the Earth's energy spectrum (see Figure 3). The intensity of passive microwave radiation depends on an object's temperature and incident radiation, plus the emittance, reflectance and transmittance properties of the object. Because the sensors operate at low energy levels, their imagery is relatively 'noisy', has lower spatial resolution and usually requires more complex interpretation. Passive microwave has been used extensively for sea-ice mapping and for other sea parameters such as surface wind, ice extent and rainfall rates (NASA 1984b). A number of new sensors are being developed specifically for these important application areas.

Microwave remote sensing systems, whether active or passive, have important implications for global climatic and weather system studies. Their data have already been used for numerous applications in meteorology and oceanography, with recent studies being directed to land applications as well. Future radar systems will offer a 'multi-band' approach to radar imagery in the same way that multi-spectral data are currently acquired in visible and infrared wavelengths. SAR also has the potential to operate at longer wavelengths, where earth surface features become translucent and returns are received from subterranean features.

The acquisition, processing and interpretation of microwave data are still largely experimental, although a large volume of knowledge has already been obtained. Some aspects of microwave data interpretation are discussed in Section 4.2. Of the available texts on the subject, Ulaby et al. (1981, 1982, 1986) offer an extensive three-volume reference on microwave remote sensing. Colwell (1983) also contains several chapters on the subject. A more descriptive review is available in Curran (1985) or Lillesand and Kiefer (1979). Drury (1987) discusses various characteristics of imaging and non-imaging microwave systems and the interpretation of radar imagery, with particular reference to geological applications.
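The stated dependence of passive microwave intensity on temperature and emittance can be summarised by the Rayleigh-Jeans approximation (a standard relation, not given in the manual): at microwave wavelengths, radiated power is nearly proportional to physical temperature, so the measured brightness temperature of an opaque surface is approximately

$$T_{B} \approx \varepsilon T$$

where T is the physical temperature and ε (between 0 and 1) is the surface emittance. A hypothetical sea surface at 290 K with ε ≈ 0.5, for instance, would register a brightness temperature of roughly 145 K.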

3. Platforms and Sensor Systems

Remote sensing devices may be operated from a variety of platforms. These can range from elevated, but ground-based, platforms such as tripods and cherry-pickers, through balloons and aircraft at various altitudes within the Earth's atmosphere (up to about 100 km), to spacecraft which operate outside the atmosphere. These platforms provide variable reliability in terms of image quality and repetitive coverage. The most commonly used platforms for remote sensing are aircraft and spacecraft.

3.1 Aircraft

In remote sensing, aircraft have most commonly been used with photographic devices, but an increasing number of airborne scanning devices are now becoming generally available. As detailed in Section 4.1, these scanners can record radiation over a wider spectral range than photographic devices and provide the data in (multi-channel) digital format. Aircraft offer maximum flexibility for acquiring remotely sensed imagery in terms of mission timing, altitude and selection of spectral regions and gains. A flight may be scheduled to coincide with the best daily or seasonal conditions and avoid poor weather or cloud cover. The image spatial resolution is determined by a combination of flight altitude and sensor characteristics, so flying height can be selected to best suit an application. Aircraft imagery also potentially offers much finer spatial resolution than can currently be obtained from non-military remote sensing satellites. Airborne scanners frequently provide a larger number of, and finer, spectral bands which can be selected to detect specific targets. Similarly, the preferred view angle and direction may be specified for each flight, allowing the acquisition of stereo pairs of imagery if required. To obtain an image swath of sufficient width, aircraft scanners generally use wider scan angles than satellite scanners. This angle is referred to as the Field of View (FOV) and may be between 70° ~ 90°. However, image pixel size is determined by a constant angle of view (known as the Instantaneous Field of View: IFOV). As shown in Figure 25, the effects of panoramic distortion become quite significant toward the edges of the FOV, with the ground pixel size increasing many times.

Figure 25: Effect of scan angle in aircraft imagery: (a) Panoramic distortion: pixel width increases significantly away from a vertical view. (b) Resulting image distortions: image features have lateral distortion when displayed with a constant pixel width.
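The growth in ground pixel size toward the swath edge follows from simple flat-earth scanner geometry: for an IFOV of β radians at altitude H and scan angle θ, the across-track pixel width is approximately H·β/cos²θ and the along-track width H·β/cosθ. A small illustrative calculation (the altitude and IFOV are hypothetical):

```python
import math

def ground_pixel_size(altitude_m, ifov_rad, scan_angle_deg):
    """Across- and along-track ground pixel dimensions for a line scanner,
    using the standard flat-earth panoramic-distortion geometry."""
    theta = math.radians(scan_angle_deg)
    along = altitude_m * ifov_rad / math.cos(theta)
    across = altitude_m * ifov_rad / math.cos(theta) ** 2
    return across, along

# A 2.5 mrad IFOV flown at 3000 m gives 7.5 m pixels at nadir; at the edge
# of a 90 degree FOV (45 degrees off nadir) the across-track width doubles.
for angle in (0, 15, 30, 45):
    across, along = ground_pixel_size(3000.0, 2.5e-3, angle)
    print(f"{angle:2d} deg: {across:5.1f} m across, {along:5.1f} m along")
```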

Furthermore, aircraft necessarily provide a less stable platform than spacecraft. The resulting imagery then contains significant geometric distortions which require correction by specific rectification processing (further discussed in Part FOUR, Image Rectification and Registration). Aircraft imaging missions are also generally more expensive and more restrictive in terms of coverage than satellite-derived imagery for the same area. The aircraft platform instability presents additional problems for image mosaicking between adjacent flight lines.

Details of some currently available airborne scanners and imaging radars are given in Table 2. McCracken and Kingwell (1988) review a range of airborne systems used in marine and coastal applications. Appendix 1 lists some organisations in Australia which provide facilities for the acquisition of aircraft scanner imagery. The CSIRO Office for Space Science and Applications (COSSA) also has a research aircraft fitted to carry scanning equipment which is available for use on a commercial basis. Tian (1989) gives a comprehensive review of the use of Daedalus AADS 1268 scanner channels for vegetation mapping and analysis. A significant joint experiment was conducted in 1985 between CSIRO and NASA to acquire airborne scanner data for 54 study sites around Australia (see ALS 1986 for location details). This 'US-Australia Joint Scanner Project' was co-ordinated by the then CSIRO Division of Mineral Physics (now Exploration Geoscience). Initial results from the analysis of these data were presented at the Fourth Australasian Conference of Remote Sensing held at Adelaide in September 1987.

Table 2: Airborne scanning and imaging radar systems.

3.2 Spacecraft

Spacecraft, both manned and unmanned, have provided platforms for a wide range of remote sensing devices since the early days of space exploration.

3.2.1 Manned

Various manned spacecraft have acquired space photography and a few have operated scanning systems. These spacecraft operate between about 60° north and south of the equator at altitudes between approximately 200~500 km to enable safe re-entry and recovery of vehicle, instrumentation, and, most importantly, crew. Data from the US space program are readily available from the EROS Data Centre and NASA Goddard Space Flight Centre. Selected imagery collected by Soviet cosmonauts is now also available from approved distributors. Contact addresses for these agencies are given in Appendix 2. The spatial resolution of space photography varies considerably, with reported pixel size being between 5~150 m. The operator-controlled nature of image collection allows selection of solar illumination angle and viewing direction. As the mode of acquisition does not yet provide for regular, repetitive coverage, this may limit the utility of the imagery to inventory-type analyses rather than monitoring studies.

The US Mercury, Gemini and Apollo series of spacecraft, launched during the 1960s, provided over 1000 normal colour photographs and some false colour infrared photographs taken from hand-held 70 mm cameras. While these have been the subject of considerable research, their age, oblique view and restricted coverage limit their current utility for environmental applications. The Apollo mission also tested the usefulness of multi-spectral imagery taken from a simultaneously triggered array of four 70 mm cameras which were fitted with filters simulating Landsat MSS spectral bands. The positive results of this test contributed to the development of the Landsat series of unmanned satellites.

Other manned spacecraft include Skylab, in the 1970s, and the Space Shuttle series, in the 1980s. Skylab was launched in May 1973 and decommissioned in February 1974. It carried several instrument packages including the EREP (Earth Resources Experiment Package) which contained a 13-channel scanner called the S192 MSS. This scanner sampled visible, near infrared, middle infrared and thermal infrared wavelengths and produced imagery of comparable spatial resolution to the Landsat MSS. The Space Shuttle series are reusable spacecraft which have been designed to commute between the Earth and space. The shuttles have carried a range of remote sensing equipment, including the Metric Camera, the first test of the Modular Optical-Electronic MSS (MOMS) and the Shuttle Imaging Radar (SIR) experiments. The latter have provided unique microwave imagery of selected regions of the Earth's surface. These have been used for research into radar applications and the development of operational, satellite-borne radar imagers. All SIR-A data (November 1981) were recorded and processed optically, so required digitising before the imagery could be computer processed. About half the SIR-B data (October 1984) were available in digital format. Both these instruments recorded imagery with HH polarisation at the L band wavelength. The SIR-C experiment planned for 1989 will allow quad-polarisation (that is, HH, VV, HV and VH) in L, X and C band wavelengths.
In the USSR, the major remote sensing space platforms have been manned satellites, and a manned space station has also been established. These platforms carry a variety of photographic equipment at an altitude of 250 km. The equivalent pixel resolution of available Russian space photography (now distributed in Australia - see Appendix 2) varies between 5~60 m and is acquired in either multi-band or panchromatic format. Stereo coverage is already available for a large part of the world. A facility also exists for requesting acquisition of specific regions.

An international space station is currently being planned for construction in the mid-1990s. This is expected to be permanently manned and includes laboratories, living quarters and support facilities. To be consistent with the Space Shuttle orbiting capabilities, this platform will orbit at a 28.5 degree inclination to the equator and at an altitude below 450 km. While it will principally house scientific and industrial experimental payloads, some provision has

been made for Earth observation. Due to its narrow orbital range, however, coverage will be rather limited. A companion series of spacecraft specifically designed for remote sensing payloads is also being developed. These spacecraft, known as the Polar Platforms or Earth Observing System (EOS), will be unmanned so are described in the section below.

3.2.2 Unmanned

Unmanned spacecraft may be divided into two broad groups: polar-orbiting earth observation satellites and geostationary meteorological satellites. Geostationary satellites are positioned in an orbit at an altitude of approximately 36,000 km above the equator. In this orbit they move with the same angular velocity as the Earth rotates and so remain over the same point on the Earth's surface. From this viewpoint the same region of the Earth's surface can be imaged at regular intervals by the satellite. A series of satellites of this nature has been established by the USA, ESA and Japan and now provides global coverage for atmospheric monitoring as listed in Table 3. The Geostationary Meteorological Satellite (GMS) launched by Japan provides imagery for Australasia from its position at 140°E. The principal sensor carried by the geostationary satellites is the Visible and Infrared Spin Scan Radiometer (VISSR). On GMS, the VISSR records a visible channel (500~750 nm) with 1.25 km pixels at nadir and a thermal infrared channel (10.5~12.5 µm) with nadir (equatorial) pixels of 5 km square. The geostationary satellites are spin-stabilised on a north-south axis and, as the name implies, the VISSR uses the satellite spin (rather than an oscillating mirror as described in Section 2.2) to scan an image line. Unlike the orbiting satellites which can utilise the relative motion of the platform to separate successive scan lines in an image, these satellites use a stepping-motor to adjust the angle of view on each spin. This scanning geometry then becomes similar to that of a photograph, with the scanner viewing the whole image from a central perspective (as discussed in Section 2.2).

Polar-orbiting satellites provide imagery with a range of resolutions: spectrally, spatially, radiometrically and temporally. These satellites operate in sun-synchronous orbits, that is, they always pass a given latitude at the same solar time. Their orbits are non-polar, covering the latitudes 82° north to 82° south of the equator, with altitudes varying between 700~1500 km. The orbital characteristics of these satellites provide near global coverage of the Earth's surface on a regular and predictable basis. Data are usually acquired on request and transmitted to the nearest ground receiving station. Some sensors have recording facilities to allow collection of imagery in regions which are outside the range of ground stations. Such recorded data are then later transmitted when the satellite is within range of an appropriate receiving station. While the Landsat series of satellites have been the best-known polar-orbiters and have provided the most commonly used data, imagery is available from a wide range of satellite sensors. Some of these systems are detailed in Table 4. A more comprehensive list of currently operational satellites for earth observation is given in Table 3 of NASA (1984a).
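The altitudes quoted above are tied directly to orbital period by Kepler's third law, T = 2π√(a³/GM); solving for the orbital radius whose period matches one sidereal day reproduces the geostationary altitude, and the same relation gives the roughly 100 minute periods of the polar orbiters. A short calculation with standard constants:

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter (m^3/s^2)
R_EARTH = 6.371e6     # mean Earth radius (m)
SIDEREAL_DAY = 86164  # seconds

# Orbital radius whose period equals one sidereal day (geostationary).
a = (GM * (SIDEREAL_DAY / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
print(f"geostationary altitude ~ {(a - R_EARTH) / 1000.0:.0f} km")  # ~35,800 km

# Periods for typical polar-orbiter altitudes:
for alt_km in (700, 850, 1500):
    r = R_EARTH + alt_km * 1000.0
    period_min = 2.0 * math.pi * math.sqrt(r ** 3 / GM) / 60.0
    print(f"{alt_km:5d} km altitude -> {period_min:5.1f} min per orbit")
```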

Table 4: Satellite sensing systems.

In Australia, the receiving station for Landsat imagery is in Alice Springs. At this central location, the station maintains contact with the satellite over the largest possible land area. The extent of this area is shown in Figure 26; the near-vertical lines show the orbital paths of the satellite, the westerly skew being due to the Earth's rotation. The current Landsat 4 and 5 satellites take 233 orbits to cover the entire surface of the Earth, each orbital path being 185 km wide. It takes 16 days for the satellite to cover the whole Earth, with each full orbit taking about 100 minutes. The paths are later subdivided into rows by the Australian Centre for Remote Sensing (ACRES; formerly the Australian Landsat Station: ALS) to form 185 km square scenes (representing 25 seconds of satellite time). Each scene can then be referenced by a path number (to indicate east-west location) and a row number (to define north-south position). The precise north-south location of a scene may be varied by defining a non-standard row, but obviously the east-west location is fixed by the orbital pattern. These scenes may then be ordered as photographic prints or computer-compatible tapes. A similar system is used for other satellite imagery to define individual image scenes.

Figure 26: Landsat coverage for Australia. The dots represent image scene centres. Each scene covers an area of 185 x 185 km which is referenced by the path and row numbers. [Source: Australian Centre for Remote Sensing]

ACRES has been routinely receiving Landsat MSS data since 1979. A Micro Image Catalogue is available on colour microfiche for previewing imagery acquired from the beginning of 1984, or on black and white microfiche before this date. Limited Landsat TM data have also been recorded as part of the AMIRA/CSIRO/ACRES Signal Processing Experiment (AMIRA Project 84/P203) and are now available for general purchase. The Centre's receiving facility at Alice Springs is in the process of being upgraded to receive TM, SPOT and MOS-1 imagery. SPOT imagery can currently be acquired over Australia by use of the on-board recorders. In this acquisition mode, however, required scenes need to be pre-requested. An ordering and catalogue search facility for SPOT data is currently available via ACRES and several other SPOT data distributors in Australia and overseas. It is expected that ACRES will routinely archive SPOT imagery when their receiving station is upgraded. The SPOT HRV are currently unique satellite-borne sensors in that they allow selectable off-nadir viewing. This provides the opportunity for acquisition of stereo pairs of imagery and the ability to obtain daily coverage of selected features for short periods.

Continental scale imagery is provided by AVHRR and CZCS. The AVHRR (Advanced Very High Resolution Radiometer) is currently carried on the NOAA 9 and 10 satellites. These spacecraft are part of the longest established series of low-altitude environmental satellites (TIROS/ESSA/ITOS/NOAA), TIROS-1 having

been launched by the USA in April 1960. The AVHRR imagery is used in two modes. Large Area Coverage (LAC) records 4/5 spectral channels in 1.1 km pixels, while Global Area Coverage (GAC) sub-samples the LAC data to 5 x 3 km pixels. While the AVHRR was designed for meteorological, hydrologic and oceanographic studies, its imagery is also useful for land-based studies. Since April 1982, GAC data have been used by NOAA to compute global vegetation index (GVI) imagery. This imagery is produced by compositing geographically registered data sets over weekly imaging periods to reduce the effects of cloud cover and has obvious application for monitoring global vegetation resources (Justice et al. 1985). The orbital patterns of the two satellites provide two daytime and two night-time overpasses each day for any region, with the wide swath overlap at temperate latitudes actually providing more frequent coverage. This repeat cycle offers valuable coverage of dynamic events such as flood and fire. Day/night image pairs also provide data for studies of thermal inertia. A number of organisations in Australia now receive AVHRR imagery, as indicated in Appendix 3c. The current coverage maps for Australia are shown in Figure 27. Due to the large volume of data, these organisations have varying procedures for image archiving. In addition to these sources, however, fortnightly mosaics of GVI data covering Australia, New Zealand and Papua New Guinea are commercially available from CSIRO Division of Wildlife and Ecology (see Appendix 3c for details). This archive contains imagery from the mid-1980s onward. The CSIRO Division of Water Resources has also archived AVHRR LAC imagery covering the Murray Darling Basin since early 1983 for use in its wider research into regional estimation of total evapotranspiration and other moisture budget parameters.

Figure 27: AVHRR coverage for Australia. Each view area is shown with the receiving station at the centre of the map, 0° horizon and spacecraft altitude at 850 km. (a) Alice Springs, (b) Townsville, (c) Perth, (d) Melbourne, (e) Hobart. [Source: P. Tildesley, CSIRO Division of Oceanography]

The CZCS (Coastal Zone Colour Scanner), carried by the Nimbus satellite, is no longer operational, but numerous impressive images of eastern Australia were acquired during its imaging periods. This scanner was designed to measure chlorophyll concentration, sediment distribution and general ocean dynamics, including sea surface temperature, but also produced excellent imagery for land resource studies. Although the geometric pixel size of

CZCS (800 m) is similar to AVHRR LAC data (1100 m), the CZCS imagery appears to contain considerably more detailed spatial information, possibly due to a smaller optical pixel size. Although imagery is no longer available from this scanner, its spectral and spatial resolutions recommend its continued use as a base for the development of inventories of natural resources. Monthly composites of SMMR (Scanning Multi-channel Microwave Radiometer; also carried on the Nimbus satellite) data for Australia, New Zealand and Papua New Guinea have been archived by CSIRO Division of Wildlife and Ecology for the period from 1979. The data have 25 km pixel resolution and are being used for drought monitoring applications.

A number of future polar-orbiting earth observation satellites are being developed. To avoid the problems associated with cloud cover in visible and infrared data, the emphasis of these systems is moving to microwave data acquisition. The Canadian Radarsat will carry a C band SAR with a movable angle of incidence and a 25 m pixel size, and will have a three-hour lag between data collection and availability. The principal application of this SAR is ice mapping, an area in which Canada has particular interest, and for which timely data are essential. ESA is also designing ERS-1 to carry several radar imaging and non-imaging devices and a visible/infrared scanner similar to SPOT HRV, while the JERS-1 (Japan) will have SAR and visible/infrared imaging systems.

The International Polar Platform program (also known as Earth Observing System: EOS) is being planned in conjunction with the International Space Station program mentioned in Section 3.2.1. This unmanned component consists of a set of platforms in sun-synchronous, near-polar orbits at an altitude of 824 km, which will carry a variety of remote sensing instruments. This system will provide a unique opportunity for remote sensing data collection by the simultaneous operation of a wide range of sensors (visible/infrared scanners, spectrometers and radar). The system is expected to consist of four satellites, with launches beginning in 1995. Each satellite is planned to have a lifetime of 15 years, with in-orbit servicing every 2 ~ 3 years to maintain or upgrade the equipment payload. NASA (1984a) outlines the background objectives of EOS, and COSSA (1987, 1988) reports on the significance of this system to Australia and New Zealand remote sensing activities.

4. Data Interpretation

Remote sensing relies on detecting variations in reflected or emitted radiation from the Earth's surface in different regions of the electromagnetic spectrum. A sensor actually records the 'radiance', that is, the radiation within a solid angle of view, of its target. As discussed in Section 1.3, surface orientation (relative to both energy source and sensor), surface roughness (relative to the wavelengths of energy), atmospheric scattering and variations in incident radiation all affect the radiation levels detected by a remote sensing device. To account for variations in the level of incident radiation, 'reflectance' is defined as the ratio of reflected to incident radiation, that is, the proportion of energy reflected by an object. In well-designed remote sensing systems, material properties such as reflectance dominate (or can be extracted from) the recorded signal.

4.1 Multi-spectral

The commonly used sensor systems described earlier in Section 3 have been designed to detect particular types of earth surface materials, so utilise spectral regions which are most appropriate to those materials. Figure 28 illustrates this by comparing the reflectances of major land cover types with the ranges of wavelength channels detected by various remote sensing systems.

Figure 28: Positions of spectral channels in various satellite remote sensing systems relative to the spectral reflectance of major land cover categories. [Adapted from Richards 1986]

The Landsat MSS bands, for example, were selected primarily to detect differences in land cover. Landsat TM also provides a blue channel which is useful for water mapping, near/middle infrared for surface moisture information, middle infrared for geological discrimination and thermal infrared for soil moisture and surface temperature. Figure 29 shows three Landsat MSS bands for an agricultural region. These images have been produced by displaying the pixel values as shades of grey, with the brighter shades indicating higher reflectance.
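The band behaviour listed after Figure 29 (vegetation bright in the near infrared, dark in the red) is the basis of the channel ratioing and differencing techniques mentioned later in this section and detailed in Part TWO. One widely used form is the normalised difference vegetation index (NDVI); a minimal sketch with hypothetical digital numbers:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Hypothetical digital numbers for three cover types in a red channel
# (e.g. MSS 5) and a near infrared channel (e.g. MSS 7):
red = np.array([15.0, 20.0, 60.0])   # vegetation, water, stubble
nir = np.array([60.0, 5.0, 65.0])
print(ndvi(nir, red))  # [ 0.6 -0.6  0.04]: vegetation stands out clearly
```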

Figure 29: Landsat MSS bands for an agricultural image: (a) MSS 4 (green), (b) MSS 5 (red), (c) MSS 7 (NIR). Note: MSS 6 (NIR) is not shown here. The three bands are also shown as a false colour composite image.

The reflectance level in each of these spectral bands tells us something about the 'object' being sensed:

Band 4 - water is lightest in this band; vegetation is dark; stubble is bright,
Band 5 - water is getting darker; vegetation is darkest in this band; stubble is bright,
[Band 6 - water is darker again; vegetation is much brighter; stubble is bright,]
Band 7 - water is darkest; vegetation is brightest; stubble is bright.

The changes in vegetation reflectance levels across the four bands are clear for the agricultural fields in the image. Cultural features such as roads are typically light and visible in all bands. Water bodies such as rivers and lakes are darkest in MSS 7. The reflectance levels depend upon a wide range of factors relating to the structure and condition of a particular land cover type. For example, Figure 5 in Section 1 illustrated how the spectral reflectance of vegetation relates to the structure in a healthy plant. As indicated in that figure and shown in Figure 30, the influence of chlorophyll and other pigments (leading to absorption in the blue and red regions and reflectance of green light) controls the response of vegetation to irradiation in the visible wavelengths. The distinctively high reflectance of vegetation in

the 0.7 ~ 1.2 µm region is dominated by the internal structure (cell walls) of the vegetative materials. Water absorption bands occur at various wavelengths throughout this region and in the continuation of the spectrum out to 2.5 µm. Variations in water content in the vegetation further influence the overall heights of response curves beyond the chlorophyll absorption band at 0.65 µm. As a plant becomes diseased, however, its structure changes, so its reflectance characteristics also change, as illustrated in Figure 31. These 'signature diagrams' have been drawn from spectral reflectance data with much finer spectral resolution than those provided by the Landsat MSS bands. Figure 32 shows the vegetation spectral signature as detected by the Landsat MSS.

Figure 30: Typical reflectance and absorption characteristics of green vegetation. [Adapted from Swain and Davis 1978]

Figure 31: Effect of disease on vegetation reflectance. [Adapted from Barrett and Curtis 1976]

Figure 32: Typical green vegetation reflectance as recorded by Landsat MSS bands. [Adapted from Swain and Davis 1978]

Vegetation features are frequently identified by their characteristically high near infrared reflectance and low red reflectance. For an image pixel containing both vegetation and background soil, the colour and visibility of the soil will modify the detected radiation levels. A light coloured soil will show a larger decrease in red radiation for increasing vegetative cover, while a dark soil will cause a larger increase in the near infrared radiation. Image processing techniques of ratioing or differencing these data channels enhance the differences between them and

reduce the influence of background soil colour. These techniques are detailed in Part TWO and reviewed by Tian (1989). An early indication of deteriorating plant health is a decrease in the actual wavelength at which reflectance increases between red and near infrared. This reflectance change is called the 'red shift' and occurs before a change in condition is visible in the plant itself. However, as the shift occurs over a very small spectral range, it requires very narrow spectral channels for detection. The spectral channels of satellite-borne scanning devices are much too coarse for this purpose; however, some aircraft-borne scanners may be used to detect these small spectral changes.

The major characteristic of soil which affects spectral response in visible and infrared wavelengths is moisture content, with organic content, texture, structure and iron oxide content also being important (Curran 1985). Increased moisture and organic matter both decrease reflectance, while iron oxide reflects red and absorbs green light. As seen in Figure 28, irradiance on water surfaces is mostly absorbed or transmitted rather than reflected. The strong absorption of near infrared energy, in contrast to the high reflectance of soil and vegetation at this wavelength, clearly defines land/water boundaries. Variations in reflectance from water surfaces may be due to changes in water depth, turbidity, water colour, substrate type (if shallow) and/or surface texture. Generally, reflectance increases with decreasing water depth or increasing particulate content. Water colour may be due to suspended soil particles or organic materials such as dissolved tannins or chlorophyll.

Two regions of the thermal infrared wavelength region, corresponding to atmospheric windows, may be used for remote sensing: 3 ~ 5 and 8 ~ 14 µm. The shorter wavelength region is predominantly used for sensing very hot objects, such as fires, while the 8 ~ 14 µm region coincides with the peak energy emission for the Earth, so is used for most applications. Unlike reflected energy, emitted energy such as thermal infrared may also be detected at night. For example, AVHRR imagery is available for day and night overpasses. Thus day/night pairs of thermal imagery may be obtained for an area, and these are frequently used to study properties such as thermal inertia. Thermal infrared data may also be calibrated to indicate apparent surface temperature. This technique is commonly used for sea surface temperature mapping (where actual sea surface temperature can be measured to within 2 degrees) and in meteorological applications. The relative temperature differences indicated by the thermal infrared channel may also be used in land areas to indicate differences in soil moisture (such as with irrigated crops) and/or vegetative cover.

4.2 Microwave

Numerous characteristics of terrain determine the intensity of returns for a given wavelength and polarisation of radar signals. The major factors influencing an object's return signal intensity are considered to be relief, surface roughness and electrical characteristics. The side-looking nature of imaging radar devices records relief characteristics with higher returns from sensor-facing slopes, and low or no returns from away-facing slopes (or radar shadows), in much the same way as an off-nadir sun position differentially illuminates a landscape with varying relief.
Depending on an object's height and radar signal characteristics, relief displacement can occur in radar imagery, with the return from the top of an object (such as a hill or building) being recorded before the return from its base. This produces a characteristic layover effect, which is maximised at short range. At longer ranges, a foreshortening effect can occur, with the size of a sloped surface being compressed in the image. Surfaces with local relief variations which are greater than, or equal to, the wavelength of the radar signal will scatter the signal in all directions (diffuse reflectors) and thus return only a small portion of the signal to the sensor (see Figure 7). Smooth surfaces, with surface roughness which is less than the wavelength of the radar energy, tend to reflect most of the energy away from the sensor (specular reflectors), thus producing a low return signal. However, the actual response will obviously depend on surface orientation relative to the sensor, so that if the surface is perpendicular to the transmitted signal, the returned signal will be directed precisely toward the sensor. Similarly, a corner reflector (adjacent smooth surfaces which doubly reflect the signal) can result in a high radar return. More

surfaces tend to be specular reflectors at microwave wavelengths than at visible and infrared wavelengths, with some objects which appear rough in visible imagery, such as roadways, appearing smooth in radar imagery.

Materials differ in terms of their electrical reflectivity and conductivity. One measure of these characteristics is called the 'complex dielectric constant'. For most dry natural materials, the magnitude of this constant has relatively low values (3 ~ 8) in the microwave region of the electromagnetic spectrum. Water, however, has a value of approximately 80. Thus water, or moisture in soil or vegetation, significantly increases radar backscatter from vegetation and rough soil surfaces. Metal objects also have characteristically high dielectric constant values.

Vegetation cover can scatter radar signals in a number of ways: volume scattering by leaves or grass, scattering from stems, corner reflection from trunks, as well as returns from the ground layer through gaps in the vegetation layers. The relative strengths of these effects depend on the radar frequency. At L band frequencies, a moderate cover of grass or trees completely masks the soil effect and volume scattering dominates (see Zoughi et al., 1986, 1987; Hirosawa et al., 1987; and Hoekman, 1987). When there is little vegetation cover, the soil microrelief dominates the X band signal, followed in strength of effect by the moisture content of the soil surface crust. The local slope and aspect modify the radar signal by decreasing or increasing the local angle of incidence. Slopes towards the aircraft are enhanced and slopes away are darkened. This effect provides the basis for the use of radar to map landform and geological structure. At X band, the effect of the vegetation is strong, and C or L bands may be better for structural mapping in densely vegetated areas. The use of different wavelengths and polarisations allows 'multi-spectral' radar imaging which can indicate both structure and properties of the Earth's surface and near-surface soil layers.

In view of the above factors, the orientation and position of the flight line directly affect the resulting radar image. The side-looking nature of SLAR sensors can also result in slant range scale distortions in the imagery (since the slant distance does not vary consistently with respect to ground distance), which require subsequent geometric rectification to be able to relate image features to a map base. However, if an object is imaged from two different flight lines, the relief displacements cause image parallax which allows the imagery to be viewed stereoscopically. As pairs of imagery which are produced by viewing terrain features from opposite sides will have reversed side-lighting effects, stereo pairs are usually produced from flights along the same path at different altitudes, the lower image having the greater extent of shadow. This altitude parallax characteristic may also be used to determine approximate heights of image features.

Passive microwave is used for atmospheric observations by detecting the thermal emission of broad atmospheric layers, with the layer sensed being determined by the choice of frequency. The atmospheric temperature and composition can be modelled using the microwave spectrum data, with other data used for calibration (such as infrared or meteorological forecasts).
Other applications include determination of sea surface temperature for global climate studies, ocean surface wind speed, rain rate, and the age and characteristics of ice and snow cover. There are detectable differences in microwave emission intensity between water and sea-ice, the differentiation of which is important for climatic studies.
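The smooth/rough distinction used throughout this section is often quantified with the Rayleigh criterion, under which a surface acts as a specular reflector when its root-mean-square relief is less than wavelength / (8 cos θ) at incidence angle θ. A small illustrative check (the surface values are hypothetical):

```python
import math

def appears_smooth(rms_height_m, wavelength_m, incidence_deg):
    """Rayleigh criterion: the surface behaves as a specular (smooth)
    reflector when h < wavelength / (8 * cos(incidence))."""
    limit = wavelength_m / (8.0 * math.cos(math.radians(incidence_deg)))
    return rms_height_m < limit

# A roadway with ~6 mm of relief is rough to visible light but smooth to
# L band (~235 mm) radar at a 45 degree incidence angle:
print(appears_smooth(0.006, 0.235, 45.0))  # True  -> specular, dark return
print(appears_smooth(0.006, 0.030, 45.0))  # False -> rough at X band (~30 mm)
```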

5. Data Resolution and Extent

Remotely sensed data provide a synoptic or regional view of the Earth's surface as well as the opportunity to identify particular features of interest. Analysis techniques frequently relate particular data values in an image to certain ground features, or to parameters which identify those features. However, the data acquisition methods of remote sensing implicitly involve at least one level of indirection. For example, a particular study may aim to determine vegetative cover and condition. Since such parameters are not directly measurable using remote sensing, they must be related to a property of vegetation which can be 'measured' remotely, namely reflectance. A further limitation which must be considered is that the data we collect using remote sensing only sample the potential range of measurements in the selected 'measurement space'. Figure 33 shows the indirect relationship between the data and measurement spaces.

Figure 33: Relationship between the measurement and analysis of image features.

To relate earth surface features or parameters to remotely sensed data, the intrinsic ability of the parameters to be resolved in the type of measurements being made must be considered together with the effectiveness of the models which relate physical processes to these measurements. The question of the structural model is too broad to consider here, but the issue of measurement is both fundamental and vital to planning remotely sensed data acquisition and its subsequent processing. The usual measurement space for remotely sensed data can have a variety of measurement 'dimensions', such as intensity, wavelength and position. These can be considered as providing a co-ordinate framework in space and time. A data set is usually a sample in this co-ordinate frame. The objective of any data analysis exercise is to distinguish effects and/or events in the data. To achieve this objective, a data set must be sufficiently resolved and cover a large enough extent. Resolution refers to the intensity, or rate, of sampling, and extent refers to the overall coverage of a data set. Extent can be seen as relating to the largest feature, or range of features, which can be observed, while resolution relates to the smallest. For a feature to be distinguishable in the data, the resolution and extent of the measurement dimensions of the data set need to be appropriate to the measurable properties of the feature. For a feature to be separable from other features, these measurements must also be able to discriminate between the differences in reflectance from the features. Resolution and extent can be seen to operate in four 'dimensions' of remotely sensed data acquisition:

a. Spectral - resolution relates to the width of wavelength channels; extent describes the number and spectral range of channels in the image.
b. Spatial - resolution relates to pixel size; extent to the overall image coverage.
c. Radiometric - resolution relates to the energy difference which determines different radiation (or brightness) levels in an image; extent to the number of levels detected.
d. Temporal - resolution relates to the repeat cycle or interval between successive acquisitions; extent to the total period over which imagery is available.

Section 4 related remotely sensed measurements to interpretative parameters. This Section is concerned with the measurement model which is implicit in remotely sensed data and the way in which these four dimensions of data acquisition can affect its interpretative value. The suitability of a particular remotely sensed data source to a specific application will depend on the resolution and extent of all data dimensions. While the final selection of a data set is usually a compromise involving other factors such as cost and project timing, these aspects need to be carefully considered to ensure that the features to be identified can be adequately discriminated in the chosen data set. This concept is discussed in more detail in Chapter 1 of Colwell (1983).

5.1 Spectral

As indicated in the preceding sections, different materials respond in different, and often distinctive, ways to EM radiation. This means that a specific spectral response curve, or spectral signature, can be determined for each material type. Basic categories of matter (such as specific minerals) can be identified on the basis of their spectral signatures alone, but this may require that the spectra be sufficiently detailed in terms of wavelength intervals and cover a wide spectral range. Composite categories of matter (such as soil, which contains several different minerals), however, may not be uniquely identifiable on the basis of spectral data alone.

Remote sensing devices generally only sample the EM spectrum by detecting the combined radiation over a range of wavelengths. For example, a sensor which is receptive to wavelengths in the range 0.4 ~ 0.5 µm would be sensing 'blue' light. This range is referred to as a spectral band, or channel, of data in an image. The spectral ranges which can be used for remote sensing of the Earth's surface are limited to the atmospheric windows, as described in Section 1.3. As discussed in Section 2, photographic recording devices only detect radiation within the ultraviolet to near infrared regions of the electromagnetic spectrum (depending on the type of film being used). Within this range, spectral channels are defined by using filters to block out wavelengths outside the required sub-region. Multi-spectral scanners, however, may also detect radiation in the middle and thermal infrared regions, as discussed above. Microwave radiation may be recorded using radar and passive microwave sensors.

Aircraft-borne scanners are typically used to provide increased spectral resolution relative to satellite-borne scanners. Many individual minerals, for example, have diagnostic narrow absorption bands at known wavelengths. Spectrometers can be used in laboratory or field work to identify specific types of mineral. Airborne Imaging Spectrometers (AIS) have recently been developed to detect wavelength bands as narrow as 2 nm.
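The effect of channel width on such narrow spectral features can be sketched by averaging a finely sampled spectrum over broad sensor channels; narrow absorption or 'edge' detail survives in the 2 nm samples but is smoothed away in the broad bands. A sketch with a hypothetical spectrum and band limits:

```python
import numpy as np

def band_average(wavelengths_nm, reflectance, band_edges_nm):
    """Average a finely sampled spectrum over each broad sensor channel."""
    values = []
    for lo, hi in band_edges_nm:
        mask = (wavelengths_nm >= lo) & (wavelengths_nm < hi)
        values.append(reflectance[mask].mean())
    return np.array(values)

# A hypothetical spectrum sampled every 2 nm, with a vegetation-like
# 'red edge' near 700 nm, resampled to two broad channels:
wl = np.arange(400, 1100, 2, dtype=float)
spectrum = 0.05 + 0.4 / (1.0 + np.exp(-(wl - 700.0) / 15.0))
bands = [(600, 700), (800, 1100)]  # e.g. a red and a near infrared channel
print(band_average(wl, spectrum, bands))  # the edge position itself is lost
```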
The satellite-borne High-Resolution Imaging Spectrometer (HIRIS) planned for the Earth Observing System will have bandwidths of approximately 10 nm to allow identification of most minerals. However, in most satellite imagery, the channels are selected to differentiate between the major cover types. One advantage of airborne imagery is that the position, width and number of spectral channels being sensed can be tailored to the specific cover type or range of covers of interest. Spectral extent describes the range of wavelengths being sensed in all channels of an image. Spectral resolution can be defined in terms of both the number of spectral channels being imaged over a given spectral region and the range of wavelengths incorporated into each single channel. An increase in spectral resolution over a given spectral range will result in a greater number of spectral channels. However, this additional resolution also 'costs' in terms of increased data volume and the consequent increase in costs associated with its processing. The theoretically optimal

spectral range and resolution for a particular cover type may therefore need to be modified with respect to the practical considerations of data collection and processing.

5.2 Spatial

Spatial resolution defines the level of spatial detail depicted in an image. This may be described as a measure of the smallness of objects on the ground that may be distinguished as separate entities in the image, with the smallest object necessarily being larger than a single pixel. In this sense, spatial resolution is directly related to image pixel size. In terms of photographic data, an image pixel may be compared to grain size, while spatial resolution is more closely related to photographic scale. In practical terms, the 'detectability' of an object in an image involves consideration of spectral contrast as well as spatial resolution. Feature shape is also relevant to visual discrimination in an image, with long thin features such as roads showing up more readily than smaller symmetric ones. Pixel size is usually a function of the platform and sensor, while the detectability may change from place to place and time to time. The exact placement of an image pixel grid on the Earth's surface is unpredictable with air- and space-borne sensors. Consequently, a pixel-sized feature which has a contrasting reflectance to its background may be imaged as a single pixel in the image, but would more typically be imaged as a part of two or even four pixels. The combined radiation of the feature and its background is then detected as the radiance values of those pixels. Radiation levels of different components within a pixel combine in a complex way, with the average radiance of four pixels of 25 m square not necessarily being the same as the radiance of a 50 m square pixel covering the same area. An additional factor here is the surface smoothness, since this affects the strength and direction of radiation.

The pixel size for aircraft- and satellite-borne scanners is a function of both the sensor (optics and sampling rate) and the platform (altitude and velocity). Landsat MSS has a nominal pixel size of 60 x 80 m, which produces a 'continuous' (that is, unblocky) image at a scale of 1:100,000. CZCS has a pixel size of 800 x 800 m, which gives a 'natural' scale of 1:1,000,000. The latter has good large area coverage, but poor detail for specific features such as individual reefs. More recent satellite scanners, such as the Landsat Thematic Mapper or SPOT, have smaller pixels (30 m and 10/20 m respectively), giving fine detail for specific features, but generating too much data to be used conveniently for large area studies. Figure 34 presents satellite views of Canberra from remote sensing devices with differing spatial resolutions.

Figure 34: Canberra imaged with different spatial resolutions: (a) Landsat MSS - 80 metre pixel, (b) SPOT MSS - 20 metre pixel (copyright CNES 1989).

Measurements in remotely sensed imagery are usually obtained by sampling the Earth's surface using a constant view angle. This angle is referred to as the Instantaneous Field of View (IFOV) and determines the ground area 'viewed' to form a pixel, or the 'optical' pixel size. The optical pixel is generally a circular or elliptical shape, and it is the combined radiation from all components within its ground area that produces the single pixel value in a given spectral channel. Along each image line, optical pixel values are sampled and recorded.
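The grid-placement effect described above, where a pixel-sized feature is smeared across two or four pixels, can be demonstrated by aggregating a synthetic scene to larger pixels. This is only a simplified stand-in for real sampling, which integrates overlapping optical pixels as discussed next:

```python
import numpy as np

def resample(image, factor):
    """Aggregate an image by averaging factor x factor blocks of pixels,
    a simple stand-in for imaging the same scene with larger pixels."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor  # trim to whole blocks
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A bright 2 x 2 'object' straddling the coarse grid is smeared equally
# over four large pixels rather than appearing as one bright pixel.
scene = np.zeros((8, 8))
scene[3:5, 3:5] = 100.0
print(resample(scene, 4))  # four pixels of 6.25 instead of one of 100
```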
The average spacing of optical pixels relative to the ground area covered is then used to determine the geometric pixel width, with the line spacing along the platform path being used in a similar way to derive geometric pixel depth. In most cases, the

geometric pixel dimensions are referred to as the nominal image pixel size. The geometric pixel size is usually similar to the optical pixel size, although there is often some overlap between adjacent optical pixels, as shown in Figure 35. Forster (1980) estimated that only 52% of the radiation recorded for a Landsat MSS pixel is due to the area of the geometric pixel. In most applications, this does not present a problem since a pixel is likely to have a similar radiance to its neighbour. However, in urban areas with high variability of cover relative to the Landsat MSS pixel size, this effect is significant.

Figure 35: Optical pixel versus geometric pixel size. The optical pixel width (that is, the area imaged by the sensor) is A, but these ground areas overlap. To account for overlapping pixels in image scaling, the geometric pixel size is given as B.

These considerations are also relevant to the selection of a grid size and shape for scan digitising. The effect of this digitising process is shown in Figure 36 for an artificial image. The size of individual ground features relative to the pixel size largely determines their contribution to the radiance measurement of a pixel. Object shape discrimination is severely degraded as pixel size approaches the object size, though the presence of the object may still be inferred up to this limit. Recent work on sub-pixel modelling is reported in Jupp et al. (1986). Image statistics relating to such spatial patterns are considered in Part TWO, Introduction to Image Processing, and Woodcock and Strahler (1987).

Figure 36: Effect of pixel size in the digitising process. In this synthetic image, circles represent tree crowns of 6 m diameter on a contrasting background. (a) 10 cm pixel, (b) 50 cm pixel, (c) 1 m pixel, (d) 2 m pixel, (e) 4 m pixel, (f) 8 m pixel.

Another spatial aspect of imagery is the extent of coverage of an imaged scene. This is usually referred to as swath width across the satellite orbit or aircraft flight path and varies with the platform altitude and the total scan angle (or Field Of View: FOV) of the sensor. Aircraft scanners typically employ a wide FOV to increase ground coverage. As

illustrated in Section 3 and Part FOUR, Image Rectification and Registration, this results in geometric distortions in the imagery. Continental scale satellite data, such as CZCS and AVHRR, also use wide scanning angles and produce imagery with severe along-line distortions due to the combined effects of earth curvature and panoramic distortion. In these cases, the nominal pixel size is usually quoted as the value at nadir. In multi-channel data, a slight spatial misregistration between channels also occurs due to the time required to read each sensor. In the case of Landsat MSS, with six image lines being recorded in each scan, this delay causes a visible stepping between individual image lines in rectified imagery. The intra-channel delay, however, is usually considered to be sufficiently small that image quality is not degraded.

The required pixel size is an important consideration in any remote sensing exercise, being basically a trade-off between cost (processing time) and detail (image resolution). The assumption that a smaller pixel size is superior for all applications is unfounded. An inventory for an area the size of NSW would require hundreds of SPOT images, dozens of Landsat scenes or one CZCS image. Much of the detail provided by the smaller pixel size would not be required in such a study. The size of the features being studied becomes important here. Investigations of small area features such as algal blooms and river outflows will obviously require imagery with pixels somewhat smaller than the feature size. However, the broad scale patterns required in a large area study may be masked by the detail of the small area pixels, or, at least, require a considerable amount of processing to be extracted. This concept of spatial scale is further discussed in Part TWO, Introduction to Image Processing, in terms of image statistics.

5.3 Radiometric

Radiometric resolution in remotely sensed data is defined as the amount of energy required to increase a pixel value by one quantisation level or 'count'. The radiometric extent is the dynamic range, or the maximum number of quantisation levels, that may be recorded by a particular sensing system. Most remotely sensed imagery is recorded with quantisation levels in the range 0 ~ 255, that is, the minimum 'detectable' radiation level is recorded as 0 while the 'maximum' radiation is recorded as 255. This range is also referred to as 8 bit resolution, since all values in the range may be represented by 8 bits (binary digits) in a computer. Radiometric resolution in digital imagery is comparable to the number of tones in a photographic image, both measures being related to image contrast. Quantisation levels are frequently given in terms of the number of bits rather than the number or range of levels. These values are related by: number of levels = 2^(number of bits), so that, for example, 8 bits give 2^8 = 256 levels. In image processing, quantisation levels are usually referred to as Digital Numbers (DN). The effect of changes in radiometric resolution on image feature contrast is illustrated in Figure 37. As discussed in Section 1.4, the human eye can only perceive 20 ~ 30 different grey levels, so the additional resolution provided by images with more than about 30 levels is not visually discernible. This sequence of images emphasises the value of interpreting imagery using digital techniques to derive maximum discrimination from the available radiometric resolution.
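The levels/bits relation and the visual sequence of Figure 37 can be reproduced by requantising an 8 bit image to fewer levels; a minimal sketch:

```python
import numpy as np

def requantise(image, bits):
    """Reduce 8 bit data (0 ~ 255) to 2**bits quantisation levels."""
    step = 256 // (2 ** bits)
    return (image // step) * step  # integer division collapses nearby DNs

dn = np.array([0, 31, 32, 100, 200, 255])
for bits in (1, 3, 6):
    print(bits, "bits ->", requantise(dn, bits))
# At 1 bit only two grey levels remain; by 6 bits (64 levels) the result is
# visually indistinguishable from the original, as noted above.
```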

Figure 37: Effect of changes in radiometric resolution on image feature contrast. Number of quantisation levels used: (a) 2, (b) 4, (c) 8, (d) 16, (e) 32, (f) 64.

Since the radiometric resolution defines the maximum number of quantisation levels detectable by a sensor, it is most unlikely that a single remotely sensed image would actually contain data values covering the entire range. For Landsat MSS data, most imagery would only contain a maximum range of 50 ~ 100 DN (or less) in each channel. This situation is usually simply due to the range of objects being imaged, since the detector sensitivities are generally selected to cope with the brightest and darkest objects of interest, and a single image would rarely contain both. Some sensing systems (such as the CZCS or Geoscan Mark 2 scanner) have adjustable sensitivities (or gains) for imaging over different surface features, the principal difference being between land and water. Atmospheric scattering and absorption effects decrease the radiometric extent of an image by reducing the discrimination between different radiation levels, especially at shorter wavelengths. Methods for improving contrast within an image for presentation and interpretation purposes are discussed in Part TWO, Introduction to Image Processing. It should be noted, however, that these techniques cannot improve the radiometric resolution of the data itself, as this is dependent on the scanning instrument, but only alter the visual contrast in an image.

Landsat MSS imagery, while distributed as 8 bit data, is actually recorded as 6 bit data, with bands 4, 5 and 6 being recorded non-linearly to have an effective data range of 7 bits. Since the imagery is formed using six different detectors (see Section 2.2.1), the final conversion to 8 bit format is combined with a rescaling process which

accounts for calibration differences between sensors. By expanding the data range during this process, fractional differences between sensors in terms of the original range can be represented more accurately (that is, with fewer rounding errors) in the final expanded range. microbrian Version 2 handles all image data as 8 bit (that is, data values in the range 0 ~ 255); Version 3 will process both 8 bit and 16 bit image data. In cases where the data were recorded with greater radiometric resolution, such as AVHRR (10 bit), the sub-range of values actually contained in the image is converted to fill the range 0 ~ 255. For example, the data extent of the AVHRR thermal infrared channel has been designed so that value 0 records a brightness temperature of approximately -273°C and value 1023 relates to about 50°C. This range allows recording of detailed information from targets covering a wide range of temperature, such as clouds, oceans and earth surface features. The latter are only ever recorded as a portion of the full data range, so the sub-range rescaling process used during conversion to 8 bit data preserves the resolution of the original data as much as possible.

5.4 Temporal

The temporal resolution of remotely sensed data refers to the repeat cycle or interval between acquisition of successive imagery. This cycle is fixed for spacecraft platforms by their orbital characteristics, but is quite flexible for aircraft platforms. Satellites offer repetitive coverage at reduced cost, but the rigid overpass times can frequently coincide with cloud cover or poor weather. This can be a significant problem when field work needs to coincide with image acquisition. While aircraft data are necessarily more expensive than satellite imagery, these data offer the advantage of user-defined flight timing, which can be modified if necessary to suit local weather conditions. The off-nadir viewing capability of the SPOT HRV provides some flexibility to the usual repeat cycle of satellite imagery by imaging areas outside of the nadir orbital path. This feature allows daily coverage of selected regions for short periods and has obvious value for monitoring dynamic events such as flood or fire.

As with the other dimensions, the 'ideal' temporal resolution will obviously vary considerably for different applications. Studies of thermal inertia may require two images per day (day/night pairs), but land cover monitoring projects may only require one image each year. While monitoring is cited as one of the major values of remotely sensed data, the processing and interpretation of multi-temporal imagery presents a wide range of problems. These temporal changes can also affect interpretation of SPOT stereo pairs if the imagery were acquired in different seasons or atmospheric conditions. The temporal dimension has special significance since we are subject to the passage of time ourselves. We are vitally interested in how things change, yet are also constrained in our ability to observe these changes in an objective fashion. Temporal differences between remotely sensed imagery can occur for a number of reasons: atmospheric differences, changes in sun position during the course of a day and during the year, and the effect of seasonal cycles in vegetation or water bodies. Atmospheric differences between two dates may alter the measured radiation levels actually recorded in the imagery when in fact the reflectance of surface features has not altered.
Digital image processing techniques which correct for these effects are considered in Part TWO, Introduction to Image Processing. Atmospheric effects are most significant in the short-wave visible and ultraviolet spectral regions. These effects are usually minor, however (except, of course, for cloud), and should be minimised if both images were acquired at the same time of year. As well as possible atmospheric changes with seasonal differences, both sun position and vegetation characteristics vary significantly during the annual cycle. Figure 38 illustrates the differences in the extent of topographic shading in remotely sensed imagery due to changes in sun position in different seasons. As well as these shadowing changes, however, the reflectance of deciduous vegetation and grasslands will also vary between seasons. These relatively short term, cyclic changes in land cover can greatly complicate the process of detecting longer term changes. The coincidence of field work and image acquisition is essential in dynamic environments such as the wet/dry tropics, since grass cover can change from green to brown in less than one month at the end of the wet season. A pixel containing a green grass background has a much higher level of near infrared radiation than a pixel with a background of dry grass. Near infrared reflectance of perennial woody vegetation can also increase quickly after

50 rain. This is especially true for many opportunistic Australian trees and shrubs (being adapted to making the most of rain in a dry environment), as the active cell division (due to both new leaves and blade expansion in old leaves) significantly increase reflectance of near infrared radiation (Dean Graetz, pers. comm). Figure 38: Effect of seasonal differences in sun position on illumination. The DTM image (distributed with the microbrian system) was used with microbrian program minsol to simulate the surface shading efects for sun positions in: (a) Summer, (b) Autumn, (c) Winter, (d) Spring, (e) DTM channel displayed with light shades corresponding to high elevation and dark shadows indicating low elevation. Diurnal variations between imagery can also result in shadowing and atmospheric differences. These may occur with imagery acquired on different platforms or, in the case of aircraft data, at different times during the day. The importance of sun position to remote sensing cannot be over-emphasised. Current work by Jupp et al. (1986) and Walker et al. (in prep.) demonstrates the effect of changes in sun position on image radiance values for synthetic and remotely sensed imagery. Particular seasons may be optimal for remote sensing of certain features. Crop cycles will dictate the timing required for image acquisition in agriculture. In the wet/dry tropics, a mid-dry season image appears to be optimal for stratifying land cover components; an early-dry season image is too 'green' and a late-dry season image is often riddled with fire scars. Alternatively, in the temperate regions, studies relating to landform are best conducted using winter imagery when land cover differences will be minimised. Thus, acquisition date can greatly affect the value 50
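The shading differences simulated in Figure 38 follow directly from the seasonal change in sun elevation. As a rough illustration, the minimal sketch below computes the approximate sun elevation at local solar noon through the year; it assumes Cooper's standard approximation for the solar declination, and the example latitude of 35 degrees South is arbitrary:

```python
import math

def noon_solar_elevation(lat_deg: float, day_of_year: int) -> float:
    """Approximate sun elevation above the horizon at local solar noon,
    in degrees, using Cooper's formula for the solar declination."""
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    return 90.0 - abs(lat_deg - decl)

# Noon sun elevation at latitude 35 degrees South across the seasons:
for season, day in [("Summer (15 Jan)", 15), ("Autumn (15 Apr)", 105),
                    ("Winter (15 Jul)", 196), ("Spring (15 Oct)", 288)]:
    print(f"{season}: {noon_solar_elevation(-35.0, day):5.1f} degrees")
```

At this latitude the noon sun falls from roughly 76 degrees above the horizon in mid-summer to roughly 33 degrees in mid-winter, which is why the winter panel of Figure 38 shows far more extensive topographic shading than the summer panel.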
