
GEOL 1460/2461 Ramsey Introduction/Advanced Remote Sensing Fall, 2018
Atmospheric interactions; Aerial Photography; Imaging systems; Intro to Spectroscopy
Week #3: September 12, 2018

I. Quick Review from Last Lecture

summarize the main points that you took from lecture 1:
o
o
o
o
o
questions?

II. Atmosphere

atmospheric window: regions that are not blocked by the Earth's atmospheric gases (so we can see the surface)
o have high atmospheric transmission and low absorption
o H2O, CO2 and O3 are the main gas species that absorb photons in the VIS - TIR
o even within the atmospheric windows, the energy is interacting with gases and particulates, so no region is 100% clear!

general stages of image processing: move from DN to radiance to calibrated radiance to physical properties of the material (reflectance, emissivity, temperature, etc.)
o DN to radiance (energy) at sensor
  - generally a linear function: gains and offsets applied for every instrument
o radiance at sensor to radiance at surface
  - removal of atmospheric terms (path radiance)
  - radiance at sensor = path radiance + ground radiance
  - (a numeric sketch of these two steps follows at the end of this section)

o atmospheric "correction" algorithms in remote sensing are designed to remove or lessen the contribution of the path radiance to get at the absolute ground radiance
  - path radiance: any energy contributed by interactions with the atmosphere over the path-length prior to detection
  - path-length: distance traveled through the atmosphere by a photon
    o function of the location of the energy source, the location of the sensor, and the wavelength
    o example: reflected solar energy travels through the atmosphere twice before detection, but emitted thermal wavelengths only travel once

transmissivity (τ): measure of the fraction of energy that passes through the atmosphere unattenuated (varies between 0 and 1)
o τ = 1 (perfectly clear atmosphere)

scattering of surface radiance from particles in the atmosphere
o 3 types:
  1. selective scattering (aka Rayleigh scattering)
     - caused by particles much smaller than the scattered wavelengths
     - atmospheric gases (N2, O2, O3)
     - affects the shorter VIS wavelengths more (UV - VIS blue)
     - example: that is why the sky is blue on Earth
       - none of these gases is present in significant quantities on Mars, for example
       - example: the Martian atmosphere scatters the longer red wavelengths, due mostly to dust (see type #2)
  2. selective scattering (aka Mie scattering)
     - caused by particles about equal in size to the wavelength
     - example: dust, smoke, aerosols
     - longer VIS wavelengths are affected more (reddish coloration)
     - example: pollution or volcanic eruptions cause very red sunsets
  3. non-selective scattering
     - caused by particles much larger than the wavelength
     - example: water vapor, ice crystals
     - all wavelengths are affected (white coloration)
     - example: clouds, haze, etc.
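A minimal numeric sketch of the calibration chain above (DN to at-sensor radiance, then removal of the path radiance). All gain, offset, path-radiance, and transmissivity values are hypothetical placeholders; real ones come from the instrument's calibration metadata and an atmospheric model. Dividing by τ to undo attenuation is one common convention, slightly more explicit than the simple sum written in the notes:

```python
# Sketch: DN -> at-sensor radiance -> ground (surface) radiance.
# All numeric values below are hypothetical placeholders.

def dn_to_radiance(dn, gain, offset):
    """DN to at-sensor radiance: generally a linear function per instrument."""
    return gain * dn + offset

def remove_path_radiance(l_sensor, l_path, tau):
    """At-sensor radiance = path radiance + tau * ground radiance,
    so ground radiance = (at-sensor - path) / tau.
    Rayleigh scattering strength goes roughly as wavelength^-4, so l_path
    is largest in the blue: (650/450)**4 ~ 4.3x more scattering at 450 nm
    than at 650 nm.
    """
    return (l_sensor - l_path) / tau

dn = 142                                               # raw digital number
l_sensor = dn_to_radiance(dn, gain=0.75, offset=-1.5)  # W/m^2/sr/um
l_ground = remove_path_radiance(l_sensor, l_path=8.0, tau=0.85)
print(f"at-sensor: {l_sensor:.1f}  ground: {l_ground:.1f} (W/m^2/sr/um)")
```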

[Figure: two air photos comparing a low amount of non-selective scattering with a much higher amount of non-selective scattering]

III. Cameras/Aerial Photography

cameras are photon detectors (different from imaging scanners)
o examples: film, vidicons, charge-coupled devices (CCDs)
o absorption of a photon breaks an electron free from its binding atom
o this change in energy state can be measured electrically
o different detector materials for different wavelength regions
  - examples: Ag-halide (film), Si (VIS), KBr (SWIR - TIR), HgCdTe (TIR)
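A quick worked check of why the detector material sets the usable wavelength range: a photon frees an electron only if its energy hc/λ exceeds the material's bandgap. A sketch using silicon's approximate ~1.12 eV bandgap (the gaps of the other materials differ; HgCdTe is composition-tunable, which is why it works in the TIR):

```python
# Photon energy vs. detector bandgap: a photon is absorbed (electron freed)
# only if E = h*c/lambda exceeds the bandgap of the detector material.
H = 6.626e-34    # Planck constant (J s)
C = 2.998e8      # speed of light (m/s)
EV = 1.602e-19   # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

SI_GAP_EV = 1.12                              # approximate Si bandgap
cutoff_nm = H * C / (SI_GAP_EV * EV) * 1e9
print(f"Si cutoff wavelength: {cutoff_nm:.0f} nm")   # ~1100 nm: VIS-NIR only

for wl_nm in (450, 550, 1100, 10000):         # blue, green, NIR edge, TIR
    e = photon_energy_ev(wl_nm)
    print(f"{wl_nm:>6} nm -> {e:5.2f} eV  detectable by Si: {e > SI_GAP_EV}")
```

A 10 µm thermal photon carries only ~0.12 eV, far below silicon's gap, which is why TIR detection needs a narrow-gap material such as HgCdTe.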

o these detectors are just one part of a remote sensing instrument:
  i. fore optics (primary and secondary mirrors/telescope)
  ii. beam splitter
  iii. detector
  iv. electronics
  v. storage

framing camera using film (much more rare these days!)
o what is film?
  - light-sensitive emulsion material embedded with silver-halide crystals
  - coarseness of these crystals determines the resolving power of the film (aka, speed)
  - photochemical reaction of a photon liberating electrons creates silver atoms
  - developing uses chemicals to convert the exposed Ag-halide grains into silver
    - all unexposed grains are removed to leave clear areas
    - exposed regions remain and are dark (brightest parts of the scene are the darkest in the developed film)
    - negative image
  - printing on paper produces a positive image

[Figure: "negative" color image vs. "positive" color image]

o with film, a 2-D image is acquired instantly
  - positives: high spatial resolution, low costs, large amount of data captured
  - negatives: limited spectral range, non-digital, high geometric distortion at edges

o ground resolution = ability to resolve ground features (expressed as the number of line pairs per m)
  Rg = (Rs × f) / H
  where Rs = system resolution (line pairs/mm); f = focal length of the camera (mm); H = camera height above ground (m)
  - the width of an individually-resolved line pair = 1/Rg
  - scale = f/H, commonly written as 1:20,000
    - 1 mm on the photograph = 20,000 mm (20 m) on the ground
  - (a numeric sketch follows below)
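The sketch promised above: the ground-resolution and scale formulas, evaluated with hypothetical camera values (f = 150 mm, H = 3,000 m, Rs = 40 line pairs/mm):

```python
# Ground resolution and photo scale for a framing camera.
# Rg = (Rs * f) / H  -> line pairs per meter on the ground
# scale = f / H      (f and H in the same units)
rs_lp_mm = 40.0    # system resolution (line pairs / mm), hypothetical
f_mm = 150.0       # focal length (mm), hypothetical
h_m = 3000.0       # camera height above ground (m), hypothetical

rg = rs_lp_mm * f_mm / h_m            # [lp/mm * mm / m] = line pairs / m
pair_width_m = 1.0 / rg               # width of one resolved line pair
scale_denominator = h_m / (f_mm / 1000.0)

print(f"Rg = {rg:.1f} line pairs/m")                # 2.0 lp/m
print(f"resolved line-pair width = {pair_width_m:.2f} m")
print(f"scale = 1:{scale_denominator:,.0f}")        # 1:20,000 here
```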

o relief displacement
  - geometric distortion at image edges giving the effect that taller objects are leaning away from the optical center of the photo
  - distortion amount is related to:
    1. the vertical height of the object
    2. the distance from the principal point
    3. inversely, the camera height
  - h = (H × d) / r
    where h = actual height of the object (m); H = camera height above ground (m); r = distance from the image center to the top of the object (measured on the photo); d = relief displacement (same image units as r)
    (a numeric sketch follows at the end of this section)
  - removal of large-scale relief displacement produces an ortho-photograph
  - stereo-pairs = successive overlapping air photos
    - because each photograph images each point on the ground from a slightly different angle, the offsets can be used to reproduce the vertical dimension
    - known as a DEM (digital elevation model)
    - these are used to produce the USGS topographic maps

o sun angle
  - low sun angle: images taken generally in early morning, late afternoon, or at high latitudes, where the sun is < 15° above the horizon
    - produces pronounced shadows if the object is perpendicular to the sun
    - excellent for interpretation of subtle topographic features
  - high sun angle: what benefits do you see in the following images?
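The sketch promised above: the relief-displacement relation h = (H × d)/r, with hypothetical photo measurements (d and r in the same on-photo units):

```python
# Relief displacement: objects lean away from the photo center, and the
# lean encodes their height. h = (H * d) / r, with d and r measured on
# the photo in the same units. All values here are hypothetical.
def object_height(camera_height_m, displacement_mm, radial_dist_mm):
    """Actual object height recovered from its relief displacement."""
    return camera_height_m * displacement_mm / radial_dist_mm

H_m = 3000.0   # camera height above ground (m)
d_mm = 2.1     # measured displacement of the object's top (mm on photo)
r_mm = 90.0    # radial distance from photo center to object's top (mm)

print(f"object height ~ {object_height(H_m, d_mm, r_mm):.0f} m")  # ~70 m
```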

IV. Imaging Systems: Scanners

systems used to build up electronic images line by line/row by row
o most common form of orbital sensors

dwell time
o dwell time = scan time per line / number of cells per line
o in other words, the amount of time a scanner has to collect photons from a ground resolution cell
o translates to:
  dwell time = (down-track pixel size / orbital velocity) ÷ (cross-track line width / cross-track pixel size)
o for the Landsat Thematic Mapper (TM) scanner:
  dwell time = (30 m / 7500 m/s) ÷ (185,000 m / 30 m)
  dwell time = 6.5 × 10^-7 sec for each pixel
o very short time per pixel -- low signal-to-noise ratio
o need to find ways to increase the dwell time for better data

Types:

cross-track scanner
o rotation or "back and forth" motion of the fore optics
o scans each ground resolution cell (pixel) one by one

along-track scanner
o multiple cross-track detectors (no scanning motion)
o positives: dwell time increases. Why?
  - in the dwell time equation, the denominator = 1.0, since the line width is in effect the cross-track width of the pixel

  - the equation reduces to:
    dwell time = down-track pixel size / orbital velocity
    dwell time = 4.0 × 10^-3 sec/pixel (for the above example)
o negatives: large arrays are difficult to fabricate (TM would require ~6,200 elements), and failure of one element produces a loss/miscalibration of an entire column of data (see below)

[Figure: Image A is an example of push-broom line array errors in band 4 of the ASTER sensor; image B is an example of cross-track scanner array errors in band 10 of ASTER.]

whisk-broom scanner
o combination of a cross-track scanner and a push-broom scanner
o scans with a small line array of detectors
o positives: longer dwell time (several lines per scan motion) if all detectors are the same wavelength
  - same dwell time as the cross-track scanner if each detector were tuned to a different wavelength
o negatives: different response sensitivities in each detector can cause striping in the image (see above)

multispectral scanners
o thus far, we have looked at scanners with just one spectral band
o how do we add multiple wavelength observations?
o add cross-track scanning with a line array
o different from a whisk-broom
  - there, the scanning is done with a line array of the same wavelength
  - here, the scanning is performed with a line array of detectors at different wavelengths
o negatives: short dwell time again; spacecraft movement and planet rotation cause imprecise alignment

      λ1  X
      λ2  X    --> scan direction
      λ3  X
       |
       v  flight direction
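Before the two solutions below, a quick numeric check of the dwell-time arithmetic above, using the Landsat TM numbers from the notes (30 m pixels, 7,500 m/s ground velocity, 185 km swath):

```python
# Dwell time for a cross-track scanner vs. an along-track (push-broom)
# scanner, using the Landsat TM numbers from the notes.
pixel = 30.0        # down-track / cross-track pixel size (m)
velocity = 7500.0   # orbital ground velocity (m/s)
swath = 185000.0    # cross-track line width (m)

line_time = pixel / velocity        # time available per line (s)
pixels_per_line = swath / pixel     # ~6,200 cells per line

cross_track_dwell = line_time / pixels_per_line
along_track_dwell = line_time       # denominator = 1: one detector per column

print(f"cross-track: {cross_track_dwell:.1e} s/pixel")   # ~6.5e-07 s
print(f"along-track: {along_track_dwell:.1e} s/pixel")   # ~4.0e-03 s
```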

2 solutions:

1. push-broom scanning with a 2-D array

      λ1  X X X ... X
      λ2  X X X ... X
      λ3  X X X ... X
       |
       v  flight direction

2. whisk-broom scanning with a 2-D array (TM scanner)

      λ1 λ2 λ3 ... λn
       X  X  X ...  X
       X  X  X ...  X    --> scan direction
       X  X  X ...  X
       |
       v  flight direction

V. Imaging Systems: Spectral Resolution / Information Interpretation

o what is spectral resolution?
  - the quantized spectrum for each pixel over the number of instrument channels
  - multi-spectral vs. hyper-spectral data
  - energy returned from the surface and detected by the sensor is quantized over some wavelength region
  - broken down into some number of discrete instrument channels

how?
o bandpass filters subdivide the EM spectrum of the pixel into discrete wavelength channels
  - each pixel holds one value per wavelength channel
  - each image comprises one channel of all the pixels, giving a multi-channel image
  - each channel can be placed in either the red, green, or blue color of a remote sensing software package
o channel width: width of the filter (band) at 50% of the peak response
o FWHM: full width/half max
  - measure of the spectral width of each wavelength channel

[Figure: filter response curve illustrating the FWHM]

example: sunlight reflected off a green leaf produces a spectrum that contains info on the amount and type of chlorophyll pigments
o the spectrum is continuous (many points), but a multispectral sensor will only detect energy over the number of wavelength regions corresponding to the number of bandpass filters
o example: a 3-point spectrum (multi-spectral instrument)
o another instrument may have hundreds of channels in this wavelength region (hyper-spectral instrument)

[Figure: Visible/near infrared (VNIR) spectra of common desert vegetation showing three wavelength bands (VNIR 1, VNIR 2, VNIR 3) of a multi-spectral instrument. However, the VNIR spectra are 1000's of points.]
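A sketch of how a continuous spectrum collapses to a few band values: weight it by Gaussian bandpass responses whose width is set by the FWHM. The toy leaf spectrum and the band centers/widths below are hypothetical, not those of any particular instrument:

```python
import numpy as np

# Collapse a continuous (hyper-spectral-like) spectrum into a 3-band
# multispectral measurement using Gaussian bandpass filters.
# Toy spectrum and band centers/FWHMs are hypothetical.
wl = np.linspace(400.0, 1000.0, 1201)        # wavelength grid (nm)

# Toy "green leaf": low VIS reflectance, small green bump, NIR plateau.
spectrum = (0.05
            + 0.05 * np.exp(-((wl - 550.0) / 30.0) ** 2)
            + 0.45 / (1.0 + np.exp(-(wl - 710.0) / 10.0)))

bands = [(480.0, 60.0), (560.0, 60.0), (830.0, 120.0)]  # (center, FWHM) nm

for center, fwhm in bands:
    sigma = fwhm / 2.3548                    # FWHM = 2*sqrt(2*ln 2)*sigma
    response = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
    band_value = (spectrum * response).sum() / response.sum()
    print(f"band @ {center:.0f} nm (FWHM {fwhm:.0f} nm): {band_value:.3f}")
```

The three printed numbers are the "3-point spectrum" of the figure above; a hyper-spectral instrument would instead report hundreds of narrow-FWHM bands across the same range.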

VI. Spectroscopy

spectroscopy: science and analysis of the EM spectra of materials
o the type of spectroscopy is a function of the wavelength region under study
  - gamma ray spectroscopy, TIR spectroscopy, etc.
o the analysis of the spectrum tells you something about the surface material
  - talked some about this last week
  - spectral features are caused by electronic processes within atoms and vibrational processes between atoms
  - example (next page): thermal infrared (TIR) emissivity spectra
    - emissivity lows indicate regions of fundamental vibrations of the bonds between the Si-O atoms
  - much more detail on all this later in the course