Philpot & Philipson: Remote Sensing Fundamentals Scanners 8.1 W.D. Philpot, Cornell University, Fall 2015
SCANNERS

8.1 General

Scanners are scanning radiometers which, when operated from an airborne or spaceborne platform, image the terrain in one or more spectral bands. Mechanical scanners are electro-optical-mechanical sensors that use a mirror or prism to focus radiation of ultraviolet, visible, or infrared wavelengths, or combinations of these wavelengths, from the ground onto one or more detectors. (Microwave scanners will be described in a later section.) The mirror or prism rotates or oscillates, scanning a line of incoming radiation that is usually perpendicular to the flight direction (Figure 8.1). New, adjacent lines are scanned as the platform moves ahead. Each line is typically recorded digitally and sometimes displayed directly on a monitor.

Figure 8.1: Data collection pattern for a mechanical (whiskbroom) scanner.

Recall that the aircraft or spacecraft is moving ahead during the time that each line is imaged. For mechanically scanning systems this means that the pixel at the end of the scan line will be imaged later than the pixel at the beginning of the scan line and will therefore be displaced along the flight path. The displacement is usually rather small but not always entirely negligible. More troublesome is the geometric distortion introduced as a result of the variation in viewing angle. This distortion is especially severe near the edges of the image, where the ground resolution elements are comparatively larger than those at nadir (Figure 8.2). For a square pixel at nadir, the width of a single scan line on the ground is Hω. At a viewing angle θ, the along-scan-line width of a pixel is Hω sec²θ, while the length along the flight line is Hω sec θ. This distortion is called the panoramic effect. Panoramic distortion can be reduced to negligible amounts by restricting the total FOV in relation to the flight height.
Figure 8.2: Panoramic effect: the increase in the GIFOV with viewing angle.

8.2 Scanning a Ground Resolution Element

The spatial parameters of scanning which are of primary concern for remote sensing are the scanner's total field-of-view (FOV), its instantaneous field-of-view (IFOV, or ω), and the ratio of the platform's velocity to its height above ground (V/H). The total FOV of a scanner is the scanner's lateral coverage (Figure 8.1). It is analogous to the angular coverage of an aerial camera in the direction perpendicular to the aircraft's flight direction. Unlike a camera, however, a scanner does not collect radiation over its total FOV at one instant of time. As noted, the scanner mirror covers the total FOV by "looking" sequentially at numerous ground spots, or resolution elements, along a scan line (Figure 8.1). These resolution elements are the smallest ground areas that the scanner can "see"; radiance emanating from all features within each ground resolution element is integrated and measured as a single level of radiation for each wavelength interval sensed by the scanner's detector or detectors. The ground resolution element viewed instantaneously by the scanner will be measured and recorded in the resultant image as a single picture element or "pixel". Ideally, each pixel corresponds to the ground resolution element defined by the IFOV. When expressed in terms of the ground distance covered by a single pixel, the instantaneous field of view is typically expressed in units of length and referred to as the Ground Instantaneous Field of View (GIFOV). Note that the GIFOV is dependent on look angle, while the angular IFOV is not. As shown in Figure 8.1, the size of the ground resolution element is determined by the scanner's IFOV, ω, its height above ground, H, and the scan angle θ, the angle between the ground element and the nadir.
To illustrate, a typical aircraft scanner might have a square IFOV of 2.5 milliradians on a side, and a total FOV of 120°, or 60° to either side of the aircraft. If the length or width of the ground resolution element is approximated by the arc of a circle whose center is at the scanner, an angle of 2.5 milliradians would intercept an arc whose length is 0.0025 times the radius. For every 1,000 meters of aircraft height above ground, the dimensions of the ground spot viewed directly below the aircraft would increase by 2.5 meters on a side (R = H = 1,000 m). The corresponding increase for ground resolution elements away from the aircraft nadir would be larger because the distance between the scanner and element (i.e., the radius of the circle) is longer. For every increase of 1,000 meters above ground, other resolution elements would increase by 2.5/cos²θ meters along a scan line and by 2.5/cos θ meters perpendicular to the scan line.
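The sec²θ and sec θ scaling described above can be checked numerically. The sketch below uses the text's values (2.5 mrad IFOV, 1,000 m height); the function name is illustrative, not from the text.

```python
import math

def gifov(h_m, ifov_rad, theta_deg):
    """Ground-projected pixel dimensions for a whiskbroom scanner.

    Returns (along_scan, along_track) sizes in meters:
    the along-scan size grows as sec^2(theta), the along-track size as sec(theta).
    """
    sec = 1.0 / math.cos(math.radians(theta_deg))
    nadir = h_m * ifov_rad                 # H * omega, the nadir GIFOV
    return nadir * sec**2, nadir * sec

# 2.5 mrad IFOV at 1,000 m above ground, as in the text
for theta in (0, 30, 60):
    scan, track = gifov(1000.0, 2.5e-3, theta)
    print(f"theta={theta:2d} deg: {scan:5.2f} m along-scan x {track:4.2f} m along-track")
```

At the edge of the 120° FOV (θ = 60°, sec θ = 2) the nadir 2.5 m pixel has grown to 10 m along-scan by 5 m along-track, which is the panoramic distortion of Figure 8.2.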
Detector-Scanner-Platform Velocity Relationships

The usual goal of creating a digital image is to collect pixels in a square array in which adjacent pixels are contiguous, but not overlapping. Since with a scanner this is done on the fly, the specific geometry of the image depends on the rate of scanning. That rate is constrained, in turn, by the detector response. While data are collected for a single pixel, there is both motion of the scanner and motion of the platform. As a consequence, single pixels do not correspond exactly to a ground resolution element defined by the IFOV. Moreover, even when single pixels correspond to single ground resolution elements, there may be gaps or overlaps between adjacent pixels if the rate of detector sampling and the rate of scanner rotation are not well matched. It is instructive to examine these relationships further.

Consider an aircraft scanner that scans the terrain with a rotating mirror through an IFOV of ω. Assuming that the mirror rotates through 360 degrees, or 2π radians, the number of elements scanned per mirror rotation is 2π divided by ω, the IFOV in the direction of scanning (Figure 8.1). The number of resolution elements (N) scanned per second is thus equal to the number of elements scanned per mirror rotation multiplied by the rate of mirror rotation (M), or N = M(2π/ω). Since the time required to scan a single resolution element is 1/N,

scan time per element = ω/(2πM)    (8.1)

The time devoted to scanning a resolution element must be at least as long as it takes the detector to respond. The detector's dwell time refers to the time the detector must "look at," or dwell on, a resolution element before it "sees" it. In general,

detector dwell time per element = k t_d    (8.2)

where k is a constant of 1.0 or more, and t_d is the time constant for the detector (i.e., the time required for the detector to respond).
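The timing relations above can be evaluated directly. A minimal sketch, where the mirror rate, detector time constant, and k are hypothetical values chosen for illustration (only the 2.5 mrad IFOV comes from the text):

```python
import math

def scan_time_per_element(ifov_rad, rotations_per_s):
    """Equation 8.1: time available per resolution element, omega / (2*pi*M)."""
    return ifov_rad / (2.0 * math.pi * rotations_per_s)

# Hypothetical values: 2.5 mrad IFOV, mirror spinning at M = 20 rotations/s,
# detector time constant t_d = 10 microseconds, k = 1
omega, M, k, t_d = 2.5e-3, 20.0, 1.0, 10e-6
t_elem = scan_time_per_element(omega, M)
print(f"elements per rotation: {2 * math.pi / omega:.0f}")
print(f"time available per element: {t_elem * 1e6:.1f} us")
print("detector fast enough:", t_elem >= k * t_d)
```

With these numbers the mirror sweeps about 2,500 element-widths per rotation and allows roughly 20 microseconds per element, which comfortably exceeds the assumed 10 microsecond dwell requirement; doubling the mirror rate would halve the margin.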
For each ground resolution element to be "seen,"

ω/(2πM) ≥ k t_d    (8.3)

The remaining variables to be considered are the velocity, V, and height, H, of the aircraft. It should be clear that, if the aircraft is moving too rapidly, there will be gaps in the ground coverage. The velocity of the aircraft must therefore be linked with the rate of scanning. In the flight direction at the nadir, the width of a single scan line is H multiplied by ω, the IFOV in the flight direction (Figure 8.1); the width of the ground strip covered each second is M(Hω). To avoid gaps or underlap between scan lines,

M(Hω) ≥ V    (8.4)

or

M ≥ V/(Hω)    (8.5)

The rate of mirror rotation is thus governed, in part, by the velocity-to-height ratio, and vice versa. (Note that overlap between adjacent scan lines produces redundant information.) The constraints can be combined by substituting for M in Equation 8.3, which results in the following:
ω² ≥ 2πk t_d (V/H)    (8.6)

As indicated by Equation 8.6, the spatial resolution can be increased (ω decreased) by decreasing t_d, with no change in V or H, if the speed of the detector can be increased. Similarly, flying faster or lower requires a faster detector for the same ω. The ratio of velocity to height, V/H, describes the angular rate (radians/second) at which a point on the ground appears to pass beneath the aircraft. For a given scanner configuration, ω and t_d are fixed, but the V/H ratio can be adjusted up to some defined maximum value.

8.4 Power Considerations

The level of signal received from a resolution element has obvious effects on the resultant scanner image. Irradiance at the scanner can be described by

E = L_tot ω    (8.7)

where
L_tot = (L_emit + L_ref) τ_a + L_a
ω = instantaneous FOV (as a solid angle)
L_emit = total emitted radiance
L_ref = total reflected radiance
τ_a = atmospheric transmissivity
L_a = path radiance

For simple scanners,

ω = A_s/R² = A_d/f²    (8.8)

where
A_s = area of the ground resolution element of the scene
R = distance between scanner and ground element
A_d = area of the detector
f = focal length of the scanner

The power received at the detector is

P = E A_o τ_o    (8.9)

where
E = irradiance at the scanner (Equation 8.7)
A_o = area of the scanner aperture = πD_o²/4
D_o = effective diameter of the aperture
τ_o = transmission of the optical system

The detector responds to the power received by producing a signal,

V_s = P R    (8.10)

where
R = responsivity = V_n D*/√(A_d B)
V_n = voltage generated by detector noise
D* = D-star, a figure of merit of the detector
B = electrical bandwidth
The signal-to-noise ratio is one means of assessing the information transferred by the power received. Rearranging Equation 8.10,

V_s/V_n = P D*/√(A_d B)    (8.11)

Substituting P = L_tot ω A_o τ_o and A_d = ω f² (from Equation 8.8), assuming a square instantaneous FOV so that the solid-angle IFOV is the square of the linear IFOV ω, and letting F = the f-number of the scanner optics = f/D_o, results in

V_s/V_n = π ω D_o τ_o D* L_tot / (4F √B)    (8.12)

In order for the scanner to achieve an angular resolution of ω, all parts of the system must be able to accommodate an electrical bandwidth of B. A nominal value for B is 1/[2(detector dwell time)] or, from Equation 8.2, B = 1/(2k t_d). Substituting from Equation 8.6 (at the limit, k t_d = ω²/[2π(V/H)], so that B = π(V/H)/ω²),

V_s/V_n = √π ω² D_o τ_o D* L_tot / [4F √(V/H)]    (8.13)

While Equation 8.13 has many variables, for a given scanner configuration one can define a system constant,

C = √π ω² D_o τ_o D* / (4F)    (8.14)

so that

V_s/V_n = C L_tot / √(V/H)    (8.15)

With C defined for the scanner optical system, the only variables involved in assessing the signal-to-noise ratio are L_tot, the level of radiance from the ground, and the V/H ratio.

Designs to increase detector dwell time

It should be clear from the above discussion that improving the spatial resolution of the scanning system will decrease the dwell time for a given design. Two possible designs that increase the dwell time are shown schematically in Figure 8.3. The simplest change is to add more detectors for adjacent scan lines (Figure 8.3a). Since a single detector is no longer required to collect every scan line, the collection time may be increased in proportion to the number of lines scanned in a single pass. A difficulty with this design is that each detector will have a slightly different gain and offset and will have to be corrected separately. A second alternative is illustrated in Figure 8.3b. In this case the detector is replaced by a linear CCD array. This removes the requirement for scanning entirely, since each detector can view a separate pixel on the scan line.
It also increases the dwell time by several orders of magnitude. The disadvantage of this design is again related to the variable calibration of the individual detectors
in the array. However, this seems a small price to pay for eliminating the scanning mechanism and greatly increasing the dwell time.

Figure 8.3: Detector designs to increase individual detector dwell time: a) design using multiple detectors, one for each scan line; b) CCD array detector.

8.7 Color scanners

Collecting color information raises further issues of timing and registration. Consider again the simple situation of a scanner that has a single detector for each spectral band. A diagram of such a scanner is shown in Figure 8.4. In this configuration, each detector looks at a separate area on the ground at any one time. The areas viewed are a function of the placement of the detectors in the focal plane of the scanner. Here they are in a line along the scan direction. It is clear from the figure that, for a single ground location, each band is collected sequentially, and registration is a function of accurate timing.

Figure 8.4: Diagram of a color scanner with one detector per band.

As with the single-band case, multiple detectors and CCD arrays can be employed to expand the spectral range of the scanner. Two arrangements are illustrated in Figure 8.5. Both are variations on the single-band design, and both have the same advantages and limitations as their single-band counterparts. Registration of spectral bands is still an issue in both cases. With the CCD design, registration along the scan line is typically near perfect; any misregistration will occur in the flight direction.
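The timing-based registration of the one-detector-per-band layout (Figure 8.4) can be made concrete. In that layout each band sees a given ground spot a whole number of element-times (Equation 8.1) after the first band. The sketch below is a minimal illustration; the band count, IFOV, and mirror rate are hypothetical values, not from the text.

```python
import math

def band_time_offsets(n_bands, ifov_rad, rotations_per_s):
    """Delay with which each band views the same ground spot.

    Assumes one detector per band, laid out in a line along the scan
    direction, so band i lags band 0 by i element-times (Equation 8.1).
    """
    t_elem = ifov_rad / (2.0 * math.pi * rotations_per_s)
    return [i * t_elem for i in range(n_bands)]

# Hypothetical 4-band scanner: 2.5 mrad IFOV, 20 rotations/s
offsets = band_time_offsets(4, 2.5e-3, 20.0)
print([f"{t * 1e6:.1f} us" for t in offsets])
```

The registration of the bands for a single ground location then rests entirely on sampling each detector with exactly these relative delays, which is why the text calls registration "a function of accurate timing."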
Figure 8.5: Schematic drawings of two detector designs for sensing multiple spectral bands: a) design using separate detectors for each scan line and multiple detectors on each scan line to image in multiple spectral bands; b) design using a separate linear CCD array for each spectral band.

Figure 8.6: Hyperspectral imaging using a 2-D CCD array.

If the design is extended to collect full spectra (e.g., hyperspectral data), another possibility is to replace the linear CCD arrays with a 2-dimensional CCD array. This configuration is illustrated in Figure 8.6. One dimension of the array is spectral, collecting an entire spectrum for one sample in the scan; the other is spatial, collecting data for the full scan line. An advantage of this design is that spectral data truly come from a single ground resolution element. Light from a single pixel entering the optical system is dispersed across the along-track dimension of the array, usually with a diffraction grating. Lines are then collected sequentially as with the linear detector in Figure 8.3b.

8.8 Pointable Satellites

The Landsat series of satellites, and most other mapping satellite systems, were designed with the intent of imaging the entire earth repeatedly with a fixed, nadir-viewing optical system. The choice was typically between polar orbiting systems that would either provide repeat coverage on a 16-18
day cycle at moderate resolution, or with a 2-3 day repeat visit at low resolution (~1 km), or geostationary systems that would provide hourly coverage or better at 8-10 km. All of the early systems were nadir-viewing, and higher resolution was incompatible with global coverage since it was not feasible to collect high-resolution data over a FOV sufficiently wide to ensure regular contiguous coverage. In order to ensure the possibility of imaging all parts of the globe at high resolution, it was necessary that the system be pointable. Initially the pointing capability was limited to cross-track pointing with the SPOT satellite. More recent systems (IKONOS, GeoEye, QuickBird, WorldView, Pleiades) allow pointing in both the along-track and cross-track directions. The systems are quite agile, being capable of pointing rapidly enough to take multiple images of the same site in one orbit, or to collect multiple sites cross-track within relatively short distances.

Figure 8.7: Illustration of the pointing capability of the IKONOS satellite. From Grodecki and Dial, Space Imaging.

Such dynamic pointing capability means that it is possible to collect stereo imagery or to collect data of the same location on earth every 1-2 days, as needed. This is an enormous advantage for monitoring disaster sites or any rapidly developing situation. The system must be tasked to collect the imagery, however, and past data for a given site may be quite sparse. Another issue that arises as a result of the pointing capability is the distortion introduced by the pointing. Although the high-resolution systems are typically pushbroom scanners, when the system is pointed the pixels suffer the same panoramic distortion that is characteristic of whiskbroom scanners. The difference is that the entire array suffers the same distortion.
An important consequence of this is that the native resolution of the image is dependent on the system viewing angle. Standard practice is to report system resolution as the nadir pixel size, the minimum possible resolution. For example, the nadir ground sampling distance (GSD), the pixel size, for the IKONOS sensor is 0.82 m for panchromatic images. At 30 degrees off nadir, the GSD is 1 m for panchromatic images. Image products are typically resampled to a standard value (for IKONOS, this is 1 m). The resampling ensures consistency among images, standardizes geometric registration, and makes it possible to create mosaics and more easily compare images collected at different times, but there is some loss in fidelity that occurs as a result of the resampling.

8.9 Orbital Mechanics

The flight path of a satellite is predetermined by its orbit, and one can classify satellite systems broadly based entirely on the choices made for their orbits. The orbital speed of a body, in our case a satellite, is the speed at which it orbits around the earth. For simplicity, we consider:
- circular orbits
- Newton's laws (nothing about energy or momentum)
- only two objects (the earth and the satellite) need to be considered
- the mass of the satellite is negligible relative to the mass of the earth

In order for a satellite to maintain a stable orbit, the centrifugal force, F_c, acting to drive the satellite away from the earth, and the gravitational force, F_g, attracting the satellite toward the earth, must balance exactly.
Given the above simplifying assumptions, the gravitational force is described by the equation:

F_g = G m_s m_e / r²    (8.16)

where:
m_s is the mass of the satellite,
m_e is the mass of the earth [5.97 x 10^24 kg],
G is the universal gravitational constant [6.67 x 10^-11 m³ kg⁻¹ s⁻²], and
r is the distance from the center of the earth to the satellite = R_e + h (radius of the earth plus the altitude of the satellite)
R_e = 6378 km (average value)

Under the same assumptions, the centrifugal force is described by the equation:

F_c = m_s v² / r    (8.17)

Setting F_c equal to F_g, one may then solve for the velocity of the satellite in a circular orbit, v_c, as a function of its distance from the earth:

v_c = √(G m_e / r)    (8.18)

The ground-track velocity is related very simply to the orbital circular velocity by the ratio

v_g / v_c = R_e / (R_e + h)    (8.19)
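Equations 8.18 and 8.19 are easy to evaluate. A minimal sketch, assuming the standard value of the earth's gravitational parameter G·m_e (the function names are illustrative):

```python
import math

GM_E = 3.986e14      # gravitational parameter of the earth, G * m_e, m^3/s^2
R_E = 6.378e6        # mean earth radius, m

def circular_velocity(h_m):
    """Equation 8.18: orbital speed for a circular orbit at altitude h (m)."""
    return math.sqrt(GM_E / (R_E + h_m))

def ground_track_velocity(h_m):
    """Equation 8.19: speed of the sub-satellite point over the ground."""
    return circular_velocity(h_m) * R_E / (R_E + h_m)

h = 800e3  # altitude of the Landsat example in the text, m
print(f"v_c = {circular_velocity(h):.0f} m/s, v_g = {ground_track_velocity(h):.0f} m/s")
```

For h = 800 km this gives roughly 7,450 m/s on orbit and 6,620 m/s over the ground, matching the Landsat example worked in the text.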
This ground-track velocity, combined with a desired ground-sampling distance (GSD), provides an upper limit on the amount of time available to collect a single scan line worth of data. This maximum dwell time is simply:

t_maxdwell = GSD / v_g    (8.20)

As an example, the Landsat satellites, which fly at an altitude of h ~ 800 km (r = R_e + h = 7178 km), have an on-orbit velocity of ~7,450 m/s. This corresponds to a ground speed of ~6,620 m/s and, for a 30 m GSD, a maximum dwell time of ~4.5 ms for a scan line worth of data.

The next issue is to choose a convenient orbital period. For example, it is often useful to require that satellites in a polar orbit pass over the same latitude at the same local time every day. This helps to minimize changes in the sun illumination angle. Another option would be to adjust the orbital period to match the earth's rotation so that a satellite over the equator could be stationary relative to the earth. The orbital period of a satellite can be determined using Kepler's 3rd Law. Johannes Kepler (1571-1630) was concerned with the basic question of describing the motion of the earth about the sun, but his laws apply generally to all satellites. The 3rd Law states that the square of the period, T, of a satellite about the earth is proportional to the cube of the satellite's mean distance from the earth, or:

T² = [4π² / (G m_e)] r³    (8.21)

Rearranging to solve for the period, T, we have:

T = 2π √[r³ / (G m_e)]    (8.22)

As an example, the International Space Station has an orbital period of 92 minutes (5,520 s). Knowing the period, we may then solve for the distance of the satellite from the center of the earth:

r = [G m_e (T/2π)²]^(1/3) = [(6.67 x 10^-11 m³ kg⁻¹ s⁻²)(5.97 x 10^24 kg)(5520 s / 2π)²]^(1/3) ≈ 6,750 km

making the altitude, h, of the space station:

h = r - R_e = 6750 km - 6378 km = 372 km

There are only a few orbits that are consistently used for the bulk of the earth-viewing satellites.
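Kepler's 3rd Law can be evaluated in both directions (period from radius, and radius from period, as in the ISS example above). A minimal sketch with standard constants; the function names are illustrative:

```python
import math

GM_E = 3.986e14   # G * m_e, m^3/s^2
R_E = 6378e3      # mean earth radius, m

def period_from_radius(r_m):
    """Equation 8.22: orbital period (s) for a circular orbit of radius r."""
    return 2.0 * math.pi * math.sqrt(r_m**3 / GM_E)

def radius_from_period(T_s):
    """Invert Equation 8.21: r = [G m_e (T / 2 pi)^2]^(1/3)."""
    return (GM_E * (T_s / (2.0 * math.pi))**2) ** (1.0 / 3.0)

# ISS example from the text: a 92-minute period
r = radius_from_period(92 * 60.0)
print(f"r = {r / 1e3:.0f} km, altitude = {(r - R_E) / 1e3:.0f} km")

# A geosynchronous orbit: period of one day
r_geo = radius_from_period(24 * 3600.0)
print(f"geosynchronous altitude = {(r_geo - R_E) / 1e3:.0f} km")
```

The ISS case reproduces the ~372 km altitude worked out in the text, and the one-day period lands near the ~35,800 km geosynchronous altitude discussed in the next section.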
The most important of these for remote sensing are the geosynchronous and sun-synchronous orbits. A description of these and closely related orbits is provided below. (Adapted from

Geosynchronous orbits (GEO) are circular orbits around the Earth having a period of 24 hours. A geosynchronous orbit with an inclination of zero degrees is called a geostationary orbit, since a spacecraft in a geostationary orbit appears to hang motionless above one position on the Earth's equator. These orbits are ideal for some types of communication and meteorological satellites, having a field of view that encompasses a full half of the planet. To attain geosynchronous orbit, a spacecraft is first launched into an elliptical orbit with an apogee of 35,786 km (22,236 miles), called a
geosynchronous transfer orbit (GTO). The orbit is then circularized by firing the spacecraft's engine at apogee.

Polar orbits (PO) are orbits with an inclination of 90 degrees. Polar orbits are useful for satellites that carry out mapping and/or surveillance operations because, as the planet rotates, the spacecraft has access to virtually every point on the planet's surface. There are two problems with the polar orbit. The first is that, since the earth moves in its orbit around the Sun, the solar irradiance angle along the satellite track varies continuously throughout the year (Figure 8.8). The second flaw is that, if all the satellites in north-south oriented orbits were polar orbiting satellites, they would tend to converge at the poles, making collisions more likely.

Walking orbits: An orbiting satellite is subjected to a great many gravitational influences. First, planets are not perfectly spherical, and they have slightly uneven mass distribution. These fluctuations have an effect on a spacecraft's trajectory. Also, the sun, moon, and planets contribute a gravitational influence on an orbiting satellite. By carefully adjusting the orbit's inclination it is possible to design an orbit which takes advantage of these influences to induce a precession in the satellite's orbital plane. The resulting orbit is called a walking orbit, or precessing orbit.

Sun-synchronous orbits (SSO) are walking orbits whose orbital plane precesses with the same period as the planet's solar orbit period, such that the satellite will cross the equator at about the same local time every orbit, making it possible to maintain a more uniform solar irradiance angle throughout the mission's duration. (For the Earth, this is accomplished by selecting an inclination about 8° off the polar orbit.)
This uniformity in the equator crossing time makes adjacent swaths as similar as possible, since it reduces effects due to varying atmospheric path and BRDF. In order to maintain exact synchronous timing, it may be necessary to conduct occasional propulsive maneuvers to adjust the orbit.

Figure 8.8: Polar (a) and sun-synchronous (b) orbit orientations. (From l-mechanics.pdf)
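The ~8° offset from a polar orbit quoted above can be checked with the standard formula for the nodal precession caused by the earth's oblateness (the J2 term). This formula is not derived in these notes; the sketch below assumes it, along with standard values for G·m_e, R_e, and J2, and sets the precession rate to one revolution per year as required for sun-synchronism.

```python
import math

GM_E = 3.986e14       # G * m_e, m^3/s^2
R_E = 6378e3          # mean earth radius, m
J2 = 1.08263e-3       # earth's dominant oblateness coefficient

def sun_sync_inclination_deg(h_m):
    """Inclination of a circular sun-synchronous orbit at altitude h.

    Uses the standard J2 nodal-precession rate (not derived in the notes):
    dOmega/dt = -(3/2) J2 n (R_e/a)^2 cos(i), set to 360 degrees per year.
    """
    a = R_E + h_m
    n = math.sqrt(GM_E / a**3)                      # mean motion, rad/s
    target = 2.0 * math.pi / (365.2422 * 86400.0)   # one revolution per year
    cos_i = -target / (1.5 * J2 * n * (R_E / a)**2)
    return math.degrees(math.acos(cos_i))

for h_km in (600, 700, 800):
    print(f"h = {h_km} km: i = {sun_sync_inclination_deg(h_km * 1e3):.1f} deg")
```

For typical mapping-satellite altitudes of 600-800 km this gives inclinations near 98°, i.e., about 8° beyond polar, consistent with the statement in the text.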
More informationIntroduction to Remote Sensing. Electromagnetic Energy. Data From Wave Phenomena. Electromagnetic Radiation (EMR) Electromagnetic Energy
A Basic Introduction to Remote Sensing (RS) ~~~~~~~~~~ Rev. Ronald J. Wasowski, C.S.C. Associate Professor of Environmental Science University of Portland Portland, Oregon 1 September 2015 Introduction
More informationObservational Astronomy
Observational Astronomy Instruments The telescope- instruments combination forms a tightly coupled system: Telescope = collecting photons and forming an image Instruments = registering and analyzing the
More informationLaser Telemetric System (Metrology)
Laser Telemetric System (Metrology) Laser telemetric system is a non-contact gauge that measures with a collimated laser beam (Refer Fig. 10.26). It measure at the rate of 150 scans per second. It basically
More informationChapter 8. Remote sensing
1. Remote sensing 8.1 Introduction 8.2 Remote sensing 8.3 Resolution 8.4 Landsat 8.5 Geostationary satellites GOES 8.1 Introduction What is remote sensing? One can describe remote sensing in different
More informationEXAMPLES OF TOPOGRAPHIC MAPS PRODUCED FROM SPACE AND ACHIEVED ACCURACY CARAVAN Workshop on Mapping from Space, Phnom Penh, June 2000
EXAMPLES OF TOPOGRAPHIC MAPS PRODUCED FROM SPACE AND ACHIEVED ACCURACY CARAVAN Workshop on Mapping from Space, Phnom Penh, June 2000 Jacobsen, Karsten University of Hannover Email: karsten@ipi.uni-hannover.de
More informationAbstract Quickbird Vs Aerial photos in identifying man-made objects
Abstract Quickbird Vs Aerial s in identifying man-made objects Abdullah Mah abdullah.mah@aramco.com Remote Sensing Group, emap Division Integrated Solutions Services Department (ISSD) Saudi Aramco, Dhahran
More informationROLE OF SATELLITE DATA APPLICATION IN CADASTRAL MAP AND DIGITIZATION OF LAND RECORDS DR.T. RAVISANKAR GROUP HEAD (LRUMG) RSAA/NRSC/ISRO /DOS HYDERABAD
ROLE OF SATELLITE DATA APPLICATION IN CADASTRAL MAP AND DIGITIZATION OF LAND RECORDS DR.T. RAVISANKAR GROUP HEAD (LRUMG) RSAA/NRSC/ISRO /DOS HYDERABAD WORKSHOP on Best Practices under National Land Records
More information18. Infra-Red Imaging Subsystem (IRIS)
18. Infra-Red Imaging Subsystem (IRIS) Instrument Parameters Brodsky (1991) suggests the following parameters for remote sensing instruments: - focal plane detector, pattern, and cooling - dwell time on
More informationAtmospheric interactions; Aerial Photography; Imaging systems; Intro to Spectroscopy Week #3: September 12, 2018
GEOL 1460/2461 Ramsey Introduction/Advanced Remote Sensing Fall, 2018 Atmospheric interactions; Aerial Photography; Imaging systems; Intro to Spectroscopy Week #3: September 12, 2018 I. Quick Review from
More informationImage Fusion. Pan Sharpening. Pan Sharpening. Pan Sharpening: ENVI. Multi-spectral and PAN. Magsud Mehdiyev Geoinfomatics Center, AIT
1 Image Fusion Sensor Merging Magsud Mehdiyev Geoinfomatics Center, AIT Image Fusion is a combination of two or more different images to form a new image by using certain algorithms. ( Pohl et al 1998)
More informationCHAPTER 7: Multispectral Remote Sensing
CHAPTER 7: Multispectral Remote Sensing REFERENCE: Remote Sensing of the Environment John R. Jensen (2007) Second Edition Pearson Prentice Hall Overview of How Digital Remotely Sensed Data are Transformed
More informationREMOTE SENSING. Topic 10 Fundamentals of Digital Multispectral Remote Sensing MULTISPECTRAL SCANNERS MULTISPECTRAL SCANNERS
REMOTE SENSING Topic 10 Fundamentals of Digital Multispectral Remote Sensing Chapter 5: Lillesand and Keifer Chapter 6: Avery and Berlin MULTISPECTRAL SCANNERS Record EMR in a number of discrete portions
More informationImportant Missions. weather forecasting and monitoring communication navigation military earth resource observation LANDSAT SEASAT SPOT IRS
Fundamentals of Remote Sensing Pranjit Kr. Sarma, Ph.D. Assistant Professor Department of Geography Mangaldai College Email: prangis@gmail.com Ph. No +91 94357 04398 Remote Sensing Remote sensing is defined
More informationINF-GEO Introduction to remote sensing. Anne Solberg
INF-GEO 4310 Introduction to remote sensing Anne Solberg (anne@ifi.uio.no) Satellites, orbits and repeat cycles Optical remote sensing Useful links: Glossary for remote sensing terms: http://www.ccrs.nracn.gc.ca/glossary/index_e.php
More informationCHARACTERISTICS OF REMOTELY SENSED IMAGERY. Spatial Resolution
CHARACTERISTICS OF REMOTELY SENSED IMAGERY Spatial Resolution There are a number of ways in which images can differ. One set of important differences relate to the various resolutions that images express.
More information10 Satellite-Based Remote Sensing
10 Satellite-Based Remote Sensing 10.1 Introduction Those beginning to read the book at this chapter could find it troublesome with so many references to previous chapters, but it is the only way we found
More informationLecture 2. Electromagnetic radiation principles. Units, image resolutions.
NRMT 2270, Photogrammetry/Remote Sensing Lecture 2 Electromagnetic radiation principles. Units, image resolutions. Tomislav Sapic GIS Technologist Faculty of Natural Resources Management Lakehead University
More informationremote sensing? What are the remote sensing principles behind these Definition
Introduction to remote sensing: Content (1/2) Definition: photogrammetry and remote sensing (PRS) Radiation sources: solar radiation (passive optical RS) earth emission (passive microwave or thermal infrared
More informationChapter 3 Solution to Problems
Chapter 3 Solution to Problems 1. The telemetry system of a geostationary communications satellite samples 100 sensors on the spacecraft in sequence. Each sample is transmitted to earth as an eight-bit
More information9/12/2011. Training Course Remote Sensing Basic Theory & Image Processing Methods September 2011
Training Course Remote Sensing Basic Theory & Image Processing Methods 19 23 September 2011 Popular Remote Sensing Sensors & their Selection Michiel Damen (September 2011) damen@itc.nl 1 Overview Low resolution
More informationIntroduction to Remote Sensing
Introduction to Remote Sensing Outline Remote Sensing Defined Resolution Electromagnetic Energy (EMR) Types Interpretation Applications Remote Sensing Defined Remote Sensing is: The art and science of
More informationCHARACTERISTICS OF VERY HIGH RESOLUTION OPTICAL SATELLITES FOR TOPOGRAPHIC MAPPING
CHARACTERISTICS OF VERY HIGH RESOLUTION OPTICAL SATELLITES FOR TOPOGRAPHIC MAPPING K. Jacobsen Leibniz University Hannover, Institute of Photogrammetry and Geoinformation jacobsen@ipi.uni-hannover.de Commission
More information1 W. Philpot, Cornell University The Digital Image
1 The Digital Image DEFINITION: A grayscale image is a single-valued function of 2 variables: ff(xx 1, xx 2 ). Notes: A gray scale image is a single-valued function of two spatial variables, ff(xx 11,
More informationNUMERICAL ANALYSIS OF WHISKBROOM TYPE SCANNER IMAGES FOR ASSESSMENT OF OPEN SKIES TEST FLIGHTS
NUMERICAL ANALYSIS OF WHISKBROOM TYPE SCANNER IMAGES FOR ASSESSMENT OF OPEN SKIES TEST FLIGHTS Piotr Walczykowski, Wieslaw Debski Dept. of Remote Sensing and Geoinformation, Military University of Technology,
More informationAdvanced Optical Satellite (ALOS-3) Overviews
K&C Science Team meeting #24 Tokyo, Japan, January 29-31, 2018 Advanced Optical Satellite (ALOS-3) Overviews January 30, 2018 Takeo Tadono 1, Hidenori Watarai 1, Ayano Oka 1, Yousei Mizukami 1, Junichi
More informationRECOMMENDATION ITU-R S *
Rec. ITU-R S.1339-1 1 RECOMMENDATION ITU-R S.1339-1* Rec. ITU-R S.1339-1 SHARING BETWEEN SPACEBORNE PASSIVE SENSORS OF THE EARTH EXPLORATION-SATELLITE SERVICE AND INTER-SATELLITE LINKS OF GEOSTATIONARY-SATELLITE
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY LINCOLN LABORATORY 244 WOOD STREET LEXINGTON, MASSACHUSETTS
MASSACHUSETTS INSTITUTE OF TECHNOLOGY LINCOLN LABORATORY 244 WOOD STREET LEXINGTON, MASSACHUSETTS 02420-9108 3 February 2017 (781) 981-1343 TO: FROM: SUBJECT: Dr. Joseph Lin (joseph.lin@ll.mit.edu), Advanced
More informationCoral Reef Remote Sensing
Coral Reef Remote Sensing Spectral, Spatial, Temporal Scaling Phillip Dustan Sensor Spatial Resolutio n Number of Bands Useful Bands coverage cycle Operation Landsat 80m 2 2 18 1972-97 Thematic 30m 7
More informationLE/ESSE Payload Design
LE/ESSE4360 - Payload Design 3.2 Spacecraft Sensors Introduction to Sensors Earth, Moon, Mars, and Beyond Dr. Jinjun Shan, Professor of Space Engineering Department of Earth and Space Science and Engineering
More informationRemote sensing image correction
Remote sensing image correction Introductory readings remote sensing http://www.microimages.com/documentation/tutorials/introrse.pdf 1 Preprocessing Digital Image Processing of satellite images can be
More informationREMOTE SENSING INTERPRETATION
REMOTE SENSING INTERPRETATION Jan Clevers Centre for Geo-Information - WU Remote Sensing --> RS Sensor at a distance EARTH OBSERVATION EM energy Earth RS is a tool; one of the sources of information! 1
More informationIKONOS High Resolution Multispectral Scanner Sensor Characteristics
High Spatial Resolution and Hyperspectral Scanners IKONOS High Resolution Multispectral Scanner Sensor Characteristics Launch Date View Angle Orbit 24 September 1999 Vandenberg Air Force Base, California,
More informationHigh Resolution Sensor Test Comparison with SPOT, KFA1000, KVR1000, IRS-1C and DPA in Lower Saxony
High Resolution Sensor Test Comparison with SPOT, KFA1000, KVR1000, IRS-1C and DPA in Lower Saxony K. Jacobsen, G. Konecny, H. Wegmann Abstract The Institute for Photogrammetry and Engineering Surveys
More informationChapters 1-3. Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation. Chapter 3: Basic optics
Chapters 1-3 Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation Radiation sources Classification of remote sensing systems (passive & active) Electromagnetic
More informationTHE INVESTIGATION OF VIEWING ANGLE EFFECTS ON GROUND SAMPLING DISTANCE OF THAICHOTE SATELLITE IMAGERY
THE INVESTIGATION OF VIEWING ANGLE EFFECTS ON GROUND SAMPLING DISTANCE OF THAICHOTE SATELLITE IMAGERY Sittipun Sangsuwan 1,2, Prasit Maksin 1, Poom Popattanachai 1, Chaichat Musana 1, Anuphao Aobpaet 1
More informationChapters 1 & 2. Definitions and applications Conceptual basis of photogrammetric processing
Chapters 1 & 2 Chapter 1: Photogrammetry Definitions and applications Conceptual basis of photogrammetric processing Transition from two-dimensional imagery to three-dimensional information Automation
More informationLecture 13: Remotely Sensed Geospatial Data
Lecture 13: Remotely Sensed Geospatial Data A. The Electromagnetic Spectrum: The electromagnetic spectrum (Figure 1) indicates the different forms of radiation (or simply stated light) emitted by nature.
More informationKEY TECHNOLOGY DEVELOPMENT FOR THE ADVENACED LAND OBSERVING SATELLITE
KEY TECHNOLOGY DEVELOPMENT FOR THE ADVENACED LAND OBSERVING SATELLITE Takashi HAMAZAKI, and Yuji OSAWA National Space Development Agency of Japan (NASDA) hamazaki.takashi@nasda.go.jp yuji.osawa@nasda.go.jp
More informationRadiometric Use of WorldView-3 Imagery. Technical Note. 1 WorldView-3 Instrument. 1.1 WorldView-3 Relative Radiance Response
Radiometric Use of WorldView-3 Imagery Technical Note Date: 2016-02-22 Prepared by: Michele Kuester This technical note discusses the radiometric use of WorldView-3 imagery. The first two sections briefly
More informationAn Introduction to Remote Sensing & GIS. Introduction
An Introduction to Remote Sensing & GIS Introduction Remote sensing is the measurement of object properties on Earth s surface using data acquired from aircraft and satellites. It attempts to measure something
More informationCompact High Resolution Imaging Spectrometer (CHRIS) siraelectro-optics
Compact High Resolution Imaging Spectrometer (CHRIS) Mike Cutter (Mike_Cutter@siraeo.co.uk) Summary CHRIS Instrument Design Instrument Specification & Performance Operating Modes Calibration Plan Data
More informationApplication of GIS to Fast Track Planning and Monitoring of Development Agenda
Application of GIS to Fast Track Planning and Monitoring of Development Agenda Radiometric, Atmospheric & Geometric Preprocessing of Optical Remote Sensing 13 17 June 2018 Outline 1. Why pre-process remotely
More information(Refer Slide Time: 00:10)
Fundamentals of optical and scanning electron microscopy Dr. S. Sankaran Department of Metallurgical and Materials Engineering Indian Institute of Technology, Madras Module 03 Unit-6 Instrumental details
More informationRADIOMETRIC TRACKING. Space Navigation
RADIOMETRIC TRACKING Space Navigation October 24, 2016 D. Kanipe Space Navigation Elements SC orbit determination Knowledge and prediction of SC position & velocity SC flight path control Firing the attitude
More informationJohn P. Stevens HS: Remote Sensing Test
Name(s): Date: Team name: John P. Stevens HS: Remote Sensing Test 1 Scoring: Part I - /18 Part II - /40 Part III - /16 Part IV - /14 Part V - /93 Total: /181 2 I. History (3 pts. each) 1. What is the name
More informationCALIBRATION OF IMAGING SATELLITE SENSORS
CALIBRATION OF IMAGING SATELLITE SENSORS Jacobsen, K. Institute of Photogrammetry and GeoInformation, University of Hannover jacobsen@ipi.uni-hannover.de KEY WORDS: imaging satellites, geometry, calibration
More informationTutorial 10 Information extraction from high resolution optical satellite sensors
Tutorial 10 Information extraction from high resolution optical satellite sensors Karsten Jacobsen 1, Emmanuel Baltsavias 2, David Holland 3 1 University of, ienburger Strasse 1, D-30167, Germany, jacobsen@ipi.uni-hannover.de
More informationCIRiS: Compact Infrared Radiometer in Space August, 2017
1 CIRiS: Compact Infrared Radiometer in Space August, 2017 David Osterman PI, CIRiS Mission Presented by Hansford Cutlip 10/8/201 7 Overview of the CIRiS instrument and mission The CIRiS instrument is
More informationOVERVIEW OF KOMPSAT-3A CALIBRATION AND VALIDATION
OVERVIEW OF KOMPSAT-3A CALIBRATION AND VALIDATION DooChun Seo 1, GiByeong Hong 1, ChungGil Jin 1, DaeSoon Park 1, SukWon Ji 1 and DongHan Lee 1 1 KARI(Korea Aerospace Space Institute), 45, Eoeun-dong,
More information3/31/03. ESM 266: Introduction 1. Observations from space. Remote Sensing: The Major Source for Large-Scale Environmental Information
Remote Sensing: The Major Source for Large-Scale Environmental Information Jeff Dozier Observations from space Sun-synchronous polar orbits Global coverage, fixed crossing, repeat sampling Typical altitude
More informationChapter 3. Data Acquisition Systems. Ayman F. Habib. Remote Sensing
Chapter 3 Data Acquisition Systems 1 Overview Utilized portions of the electro-magnetic radiation. Visible band. Infrared band. LIDAR systems. Microwave band (RADAR). Optical sensors (scanning operational
More informationComprehensive Vicarious Calibration and Characterization of a Small Satellite Constellation Using the Specular Array Calibration (SPARC) Method
This document does not contain technology or Technical Data controlled under either the U.S. International Traffic in Arms Regulations or the U.S. Export Administration Regulations. Comprehensive Vicarious
More informationOVERVIEW OF THE ALOS SATELLITE SYSTEM
OVERVIEW OF THE ALOS SATELLITE SYSTEM Presented to The Symposium for ALOS Data Application Users @Kogakuin University, Tokyo, Japan Mar. 27, 2001 Takashi Hamazaki Senior Engineer ALOS Project National
More informationWorldView-2. WorldView-2 Overview
WorldView-2 WorldView-2 Overview 6/4/09 DigitalGlobe Proprietary 1 Most Advanced Satellite Constellation Finest available resolution showing crisp detail Greatest collection capacity Highest geolocation
More informationChapter 6 Part 3. Attitude Sensors. AERO 423 Fall 2004
Chapter 6 Part 3 Attitude Sensors AERO 423 Fall 2004 Sensors The types of sensors used for attitude determination are: 1. horizon sensors (or conical Earth scanners), 2. sun sensors, 3. star sensors, 4.
More informationSensor resolutions from space: the tension between temporal, spectral, spatial and swath. David Bruce UniSA and ISU
Sensor resolutions from space: the tension between temporal, spectral, spatial and swath David Bruce UniSA and ISU 1 Presentation aims 1. Briefly summarize the different types of satellite image resolutions
More informationFundamentals of Remote Sensing
Climate Variability, Hydrology, and Flooding Fundamentals of Remote Sensing May 19-22, 2015 GEO-Latin American & Caribbean Water Cycle Capacity Building Workshop Cartagena, Colombia 1 Objective To provide
More informationDEM GENERATION WITH WORLDVIEW-2 IMAGES
DEM GENERATION WITH WORLDVIEW-2 IMAGES G. Büyüksalih a, I. Baz a, M. Alkan b, K. Jacobsen c a BIMTAS, Istanbul, Turkey - (gbuyuksalih, ibaz-imp)@yahoo.com b Zonguldak Karaelmas University, Zonguldak, Turkey
More informationP1.53 ENHANCING THE GEOSTATIONARY LIGHTNING MAPPER FOR IMPROVED PERFORMANCE
P1.53 ENHANCING THE GEOSTATIONARY LIGHTNING MAPPER FOR IMPROVED PERFORMANCE David B. Johnson * Research Applications Laboratory National Center for Atmospheric Research Boulder, Colorado 1. INTRODUCTION
More informationINTRODUCTION The validity of dissertation Object of investigation Subject of investigation The purpose: of the tasks The novelty:
INTRODUCTION The validity of dissertation. According to the federal target program "Maintenance, development and use of the GLONASS system for 2012-2020 years the following challenges were determined:
More informationEarth s Gravitational Pull
Satellite & Sensors Space Countries Earth s Gravitational Pull The Earth's gravity pulls everything toward the Earth. In order to orbit the Earth, the velocity of a body must be great enough to overcome
More informationModule 3 Introduction to GIS. Lecture 8 GIS data acquisition
Module 3 Introduction to GIS Lecture 8 GIS data acquisition GIS workflow Data acquisition (geospatial data input) GPS Remote sensing (satellites, UAV s) LiDAR Digitized maps Attribute Data Management Data
More informationVolume 1 - Module 6 Geometry of Aerial Photography. I. Classification of Photographs. Vertical
RSCC Volume 1 Introduction to Photo Interpretation and Photogrammetry Table of Contents Module 1 Module 2 Module 3.1 Module 3.2 Module 4 Module 5 Module 6 Module 7 Module 8 Labs Volume 1 - Module 6 Geometry
More informationRemote Sensing. Measuring an object from a distance. For GIS, that means using photographic or satellite images to gather spatial data
Remote Sensing Measuring an object from a distance For GIS, that means using photographic or satellite images to gather spatial data Remote Sensing measures electromagnetic energy reflected or emitted
More information