William B. Green, Danika Jensen, and Amy Culver California Institute of Technology Jet Propulsion Laboratory Pasadena, CA 91109


DIGITAL PROCESSING OF REMOTELY SENSED IMAGERY

INTRODUCTION AND BASIC DEFINITIONS

Digital images can be acquired from a variety of devices. Image scanners attached to personal computers can generate digital images of hard-copy material. Digital cameras operate without film, recording a digital image of the scene in local solid-state memory. Remote sensing instruments routinely return digital imagery to receiving stations for processing and display. Digital processing of remotely sensed imagery is a technology that is now over thirty years old. Earth-orbiting and deep space exploration spacecraft have been returning digital imagery for many years, as have Earth-based systems such as biomedical imaging devices and other commercially available equipment.

Each of these devices produces a digital version of an image as a two-dimensional array of numbers. The values in the matrix represent the brightness of the scene at each sampled position in the image. Figure 1 illustrates the basic concept of digital imagery. The left side of the figure shows a feature in object space consisting of adjacent black and white rectangles. The dashed line indicates the position of a single scan line extracted from a digital image representation of the scene. The sequence of numbers shows the digital intensity values assigned to each point along the scan line: 14, 12, 15, 45, 86, 147, 198, 245, 250, 253, 250.

Figure 1. Digital scan line across brightness transition.

Review of Progress in Quantitative Nondestructive Evaluation, Vol. 17, edited by D.O. Thompson and D.E. Chimenti, Plenum Press, New York, 1998.
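The scan-line idea in Figure 1 can be sketched in a few lines of NumPy. The intensity values below are the ones read off the figure; the thresholds are illustrative, not part of the original:

```python
import numpy as np

# One scan line from an 8-bit image, crossing a black-to-white transition
# (intensity values taken from Figure 1).
scan_line = np.array([14, 12, 15, 45, 86, 147, 198, 245, 250, 253, 250],
                     dtype=np.uint8)

# "Black" is not 0 and "white" is not 255: the sensor has a base level
# (dark current) and adds noise to the sampled signal.
dark_level = int(scan_line.min())      # 12, not 0
bright_level = int(scan_line.max())    # 253, not 255

# The sharp edge in object space is spread across several intermediate
# samples; how many depends on the sampling frequency (resolution).
edge_samples = int(np.count_nonzero((scan_line > 30) & (scan_line < 230)))
print(dark_level, bright_level, edge_samples)
```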

The figure illustrates several points. In this example, brightness is represented by 8 bits per picture element (pixel): 0 represents black and 255 represents full white. Note that the pixel values in the black square area are not equal to zero, and that the white pixel intensities are not exactly 255. Most digital image sensors have a base level, or dark current, and also introduce noise into the sampled signal. The second point illustrated by Figure 1 is that the sharp black-to-white transition in object space is not as sharp in the digital representation. The ability to represent sharp transitions in object space within a digital image depends on the frequency of sampling, or resolution, of the digital image. A digital sampling of a real scene will never exactly reproduce the information in object space, and will introduce noise and other artifacts into the sampled imagery.

Remotely sensed imagery is acquired using telemetry systems that transfer digital data values from the instrument on the spacecraft through ground receiving stations to the end-user data system. Several factors can influence the quality of images received on the ground. Telemetry dropouts can cause portions of the image data to be lost in transmission. Noise in telemetry links can add to the basic sensor noise. Limited bandwidth or on-board data storage constraints may make it necessary to compress the data significantly, reducing information content. Finally, the same data can be sent several times from the spacecraft, or received at more than one ground station, resulting in different versions of the same basic information with different signal-to-noise characteristics.

Figure 2. One segment of an image of Europa transmitted by the Galileo spacecraft. Note the periodic telemetry dropouts and the significant noise introduced in several areas of the image.
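A minimal simulation of these effects on an ideal black/white edge might look like the sketch below. The dark-current level, gain, noise figure, and dropout location are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Object space: a perfect step from black (0.0) to white (1.0).
ideal = np.where(np.arange(100) < 50, 0.0, 1.0)

dark_current = 12.0    # sensor base level, in counts (illustrative)
gain = 240.0           # counts at full scale (illustrative)
sensor_noise = rng.normal(0.0, 2.0, ideal.size)

# What an 8-bit sensor actually records: offset, scaled, noisy, clipped.
sampled = np.clip(dark_current + gain * ideal + sensor_noise,
                  0, 255).astype(np.uint8)

# A telemetry dropout loses a run of samples entirely (zero-filled here).
sampled[70:75] = 0
```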

The full image of Jupiter's moon Europa was actually received in several segments like this over a period of several days, and these segments were reconstructed to produce the final image shown in Figure 3 (which still has one missing segment that was never recovered).

IMAGE ENHANCEMENT

The image shown in Figure 3 is very dark. This is because the camera system is designed to be far more sensitive to light levels than the human eye, and can discriminate more gray levels than a human observer. When observing a scene, the camera generally records information within a small portion of its total available dynamic range. One enhancement technique, called contrast enhancement or contrast stretch, expands the dynamic range of an image to take advantage of the full dynamic range available in the output display device (film and a workstation display are two examples). Figure 4 shows the result of remapping the intensity values of the image in Figure 3 to take advantage of the total dynamic range available in a film or paper print.

Figure 3. Complete image of Europa constructed from image segments transmitted at four different times.
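The remapping described above can be sketched as a linear stretch of the occupied intensity range onto the full 0–255 output range. This is a simplified sketch, not the actual MIPL processing:

```python
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    """Linearly remap the image's occupied intensity range [min, max]
    onto [out_min, out_max] -- a basic contrast stretch."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return np.full_like(img, out_min)
    scaled = (img.astype(np.float64) - lo) / (hi - lo)
    return np.rint(out_min + scaled * (out_max - out_min)).astype(np.uint8)

# A dark image that only occupies counts 10..60 ...
dark = np.array([[10, 20], [40, 60]], dtype=np.uint8)
stretched = contrast_stretch(dark)    # ... now spans the full 0..255
```

Operational stretches typically map chosen low and high percentiles rather than the absolute minimum and maximum, so that a few outlier pixels do not compress the rest of the range.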

Figure 4. Europa image after contrast enhancement.

Contrast stretching is one example of subjective image processing. This type of processing is generally performed interactively and adaptively, with the objective of displaying the information content of the image. The processing may alter the true relationship between the brightness values in different areas of the image, or introduce other artifacts, in order to exaggerate specific features of interest to the end user. The degree of success in this type of processing is measured by the observer's ability to discern the information content of interest for the particular analysis.

QUANTITATIVE IMAGE PROCESSING

The objective of quantitative image processing is to provide an accurate quantitative rendition of information in object space. One example is the removal of instrument signature to produce a radiometrically accurate image. Another is the use of cartographic projection to show correct relationships between objects and correct shapes. Color reconstruction from multiple images is a third technique. Quantitative image processing is performed using pre-set algorithms and procedures, generally in a "hands-off" mode: there is no subjective evaluation or modification of the results.

The purpose of cartographic projection is to transform an image into a standardized mapping projection (e.g., a Mercator projection). Cartographic projection removes

viewing distortion and provides a reference system for detailed measurements of surface features and of relationships between features. Cartographic projection is often performed on mosaics built from multiple images. Geometrically transforming individual images and automatically mosaicking the projected images is a quantitative process, performed using knowledge of the camera position and orientation and of the location of the object being viewed.

Figure 5 shows four individual images that have been enhanced to bring out the surface detail on Europa. Figure 6 shows the same four images built into a cartographically projected mosaic. Figure 7 shows a higher-level mosaic in which the four Galileo images of Europa have been superimposed onto an image acquired by the Voyager spacecraft in the 1970s. The difference in resolution of surface detail between the Galileo and Voyager images is dramatically illustrated by this computer-generated mosaic.

Figure 5. Four separate Galileo images of Europa, after image enhancement. The ragged edges are due to the compressed nature of the original data; the individual scan lines produced more data than could be accommodated by on-board storage buffers.
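The geometric transformation at the heart of this process can be sketched as an inverse mapping: for each output (map) pixel, compute which source (camera) pixel it came from, then resample. The function below is a generic nearest-neighbour version with a toy mapping; a real pipeline would derive inverse_map from the camera model and the chosen map projection:

```python
import numpy as np

def reproject(img, inverse_map, out_shape, fill=0):
    """Nearest-neighbour geometric transform.  inverse_map takes output
    (row, col) index arrays and returns the source coordinates they map
    back to in the input image."""
    out = np.full(out_shape, fill, dtype=img.dtype)
    rows, cols = np.indices(out_shape)
    src_r, src_c = inverse_map(rows, cols)
    src_r = np.rint(src_r).astype(int)
    src_c = np.rint(src_c).astype(int)
    # Keep only output pixels whose source falls inside the image; the
    # rest stay at the fill value (like the gaps in a mosaic).
    valid = ((src_r >= 0) & (src_r < img.shape[0]) &
             (src_c >= 0) & (src_c < img.shape[1]))
    out[valid] = img[src_r[valid], src_c[valid]]
    return out

# Toy "projection": shift the image one column to the right.
img = np.arange(9, dtype=np.uint8).reshape(3, 3)
shifted = reproject(img, lambda r, c: (r, c - 1), (3, 3))
```

Mosaicking then amounts to reprojecting each image into the same output grid and compositing the valid pixels.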

Figure 6. Computer-generated, cartographically projected mosaic of the four Europa images in Figure 5. The compression artifacts have also been removed in producing this image.

Figure 7. The four Europa images acquired by Galileo, overlaid on a Voyager image. All images have been projected to the same mapping projection so that they can be registered.

Mars Pathfinder landed on July 4, 1997, and has been providing spectacular imagery of the Martian terrain from the Imager for Mars Pathfinder (IMP) camera on the lander since the first day of the mission. The IMP camera has a 14-degree field of view, so it is necessary to construct mosaics from multiple images to obtain a view of the full scene. The IMP camera was provided for Mars Pathfinder by the University of Arizona under contract to JPL. The University of Arizona team provided JPL with camera models, based on preflight calibration tests, that enabled first-order correction for camera parallax in the near field. These models were incorporated into JPL geometric transformation and mosaicking software, making it possible to remove first-order distortion effects from mosaics produced within minutes of receipt of data on the ground.

Figure 8 shows a segment of the first mosaic, produced within a few minutes of data acquisition. The IMP was in a stowed position, located very close to the rover, which is seen on a solar panel prior to deployment onto the surface. In this first-look mosaic, near-field parallax produces obvious distortion: the rover wheels appear split at the edges of adjacent images, the solar panel and portions of the rover appear bent out of shape, and other artifacts of the imaging geometry are visible. Figure 9 shows the same mosaic after correction for the near-field parallax based on the camera models and the viewing geometry. This improved mosaic was produced within 5 minutes after the mosaic in Figure 8, and was shown (in its color version) at the first press conference shortly after data acquisition. The curved lower boundary of the image in Figure 9 illustrates the degree of geometric correction applied to minimize distortion.

Figure 8. First-order mosaic produced within a few minutes of data receipt, with visible distortion due to near-field parallax and the geometry of the image acquisition.

Figure 9. Initial correction of the mosaic in Figure 8. The color version of this mosaic incorporates over thirty individual images, and was produced within 5 minutes after the image in Figure 8 was available on July 4, 1997.

SUMMARY

This paper has provided an introduction to some of the basic methods used to process imagery from deep space remote sensing missions. These methods have wide application in many areas of image analysis, and are in routine use on a variety of image types in many technical areas.

ACKNOWLEDGEMENTS

All images were produced at the Multimission Image Processing Laboratory (MIPL) at JPL. Mars Pathfinder image mosaics were produced at MIPL by Doug Alexander and Myche McAuly, using images processed in the real-time system by Allan Runkle and software developed by Jean Lorre. The efforts of those involved in the successful Galileo mission to Jupiter and the Mars Pathfinder mission, and of those who designed and developed the science imaging payloads flown on those missions, have given us the opportunity to process thousands of extraordinary images and are gratefully acknowledged. The support of Dr. Michael Belton, leader of the Galileo Imaging Science Team, and Dr. Peter Smith, Principal Investigator on the University of Arizona IMP camera, is also acknowledged. This paper describes one phase of work performed at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.