Radiometric and Geometric Correction Methods for Active Radar and SAR Imageries


M. Mansourpour 1, M.A. Rajabi 1, Z. Rezaee 2

1 Dept. of Geomatics Eng., University of Tehran, Tehran, Iran, mansourpour@gmail.com, marajabi@ut.ac.ir
2 National Cartographic Centre (NCC), Tehran, Iran, rezaee.ncc@gmail.com

Abstract

Radar imagery has been one of the most important data sources and efficient tools for terrain analysis and natural resource surveys since the 1960s. With the development of radar remote sensing technology, a new generation of radar sensors, the Synthetic Aperture Radar (SAR), was born. The unique specifications of radar systems and images compared with optical ones have led to a whole new series of applications for radar imagery all over the world. However, the level of accuracy achievable from radar imagery is still a problem for these applications. Multiplicative noise such as speckle, an unavoidable part of coherent radar images, degrades radiometric quality and interpretability. Moreover, geometric distortions such as foreshortening, layover and shadow, together with other problems related to the special imaging geometry of radar systems, decrease the reliability of radar imagery. Thus, radiometric and geometric corrections and calibrations must be applied to radar images before they are used. This paper uses four filters with different window sizes to remove or reduce the speckle noise: Lee, Lee-Sigma, Gamma-MAP and Frost. It is shown that the Gamma-MAP filter performs better than the other three filters when the ratio of mean to standard deviation (Mean/Std) is used as the criterion. Moreover, geometric correction is carried out using three different methods: a polynomial transform with five control points, geocoding with ephemeris, and orthorectification using ephemeris, a DTM and control points. The results show that the last method, known as the radargrammetric method, is more successful in removing the effects of foreshortening and layover, and that its geometric accuracy is also better than that of the other two methods.

KEY WORDS: SAR Images, Speckle Noise, Filter, Foreshortening, Layover, Shadow, Orthorectification

1. Introduction

According to Lillesand and Kiefer (2000), remote sensing is the science and art of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area, or phenomenon under investigation. Based on the wavelength in which a system works, remote sensing is categorized into two different groups, optical and microwave. Optical remote sensing uses visible and infrared waves, while microwave remote sensing uses radio waves [1]. As a microwave remote sensing system, RADAR (Radio Detection And Ranging) sends out pulses of microwave electromagnetic radiation and measures the strength of, as well as the time between, the transmitted and reflected pulses to determine both the type of reflector and its distance from the transmitter (Raney, 1998). Different pulse intervals, wavelengths (ranging from less than 1 mm to 1 m), geometries and polarizations can all be used to determine the roughness, geometry and moisture content of the Earth's surface [2]. During the past two decades, different satellites carrying RADAR sensors have been put into orbit. SEASAT, SIR-A, SIR-B, SIR-C, ERS-1, ERS-2, ALMAZ, JERS-1, and RADARSAT are some of the satellite missions that use RADAR technology.

Radar remote sensing, like optical remote sensing, is used to produce images of the Earth's surface. A radar image is a record of the interaction of energy and objects at the Earth's surface. Its appearance depends on variables such as the geometric shape, surface roughness and moisture content of the target object, as well as the sensor-target geometry and the transmission direction (look direction) of the radar sensor. There are significant differences, however, between how a radar image is formed and what is represented in that image compared to optical remote sensing imagery [3].

Compared to optical remote sensing, radar imaging has some advantages. First, as an active system, it is a day/night data acquisition system. Second, considering the behaviour of electromagnetic waves at RADAR wavelengths, atmospheric conditions such as cloud, light rain, haze, and smoke have little effect on the capability of a RADAR data acquisition system. This makes RADAR an all-weather remote sensing system. Last but not least, as RADAR signals partially penetrate into soil and vegetation canopy, they can provide subsurface information in addition to surface information.

The returned signal (backscatter) from ground objects (targets) is primarily influenced by the characteristics of the radar signal, the geometry of the radar relative to the Earth's surface, the local geometry between the radar signal and its target, and the characteristics of the target. A radar image is a display of grey tones that are proportional to the amount of backscatter received from a target. Targets that produce a large amount of backscatter appear as light grey tones on a radar image, targets that produce little backscatter appear as dark grey tones, and targets that reflect intermediate amounts of backscatter appear as intermediate grey tones.

A Synthetic Aperture Radar (SAR) system illuminates a scene with microwaves and records both the amplitude and the phase of the back-scattered radiation, making it a coherent imaging process. The received signal is sampled and converted into a digital image. The field recorded at pixel x, denoted E(x), can be written as [4]

E(x) = \sum_{s} a(s)\, e^{i\varphi(s)}\, h(s, x)    (1)

where the summation ranges over the scatterers, a(s) and \varphi(s) are respectively the amplitude and phase of the signal received from scatterer s, and h is the instrument (or point-spread) function. The value of h is near 1 when s is in or near the resolving cell corresponding to pixel x, and near zero otherwise. Assuming that h is translation-invariant (does not depend on x), it can be written as a one-parameter function h(s - x). The detected field E is an array of complex numbers. The square of the modulus of the field at x is called the detected intensity at x; the square root of the intensity is called the envelope or the amplitude. This is not the same as the amplitude of the received signal, because the received field is perturbed by the instrument function. The amplitude of the received signal, a(s), is called the reflectivity, and its square is called the surface cross-section. Unfortunately, this quantity is contaminated with speckle noise, and the goal of all speckle noise reduction methods is to recover it.

Inherent in all RADAR imagery is speckle noise, which is nothing else but variation in backscatter from inhomogeneous cells. Speckle noise gives a grainy appearance to radar imagery.
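As a small illustration of Eq. (1), the following Python sketch coherently sums many unit-amplitude scatterers with random phases inside a single resolving cell (so h = 1) and looks at the statistics of the detected intensity. The scatterer count and pixel count are arbitrary choices for the illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def detected_intensity(n_scatterers=50, n_pixels=1000):
    """Toy simulation of Eq. (1): unit-amplitude scatterers a(s) = 1 with
    random phases phi(s), all inside one resolving cell (h = 1)."""
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_pixels, n_scatterers))
    field = np.exp(1j * phases).sum(axis=1)   # E(x) = sum_s a(s) e^{i phi(s)}
    return np.abs(field) ** 2                 # detected intensity |E(x)|^2

intensity = detected_intensity()
print(intensity.mean(), intensity.std())      # std is close to the mean
```

For fully developed speckle the simulated intensity is approximately exponentially distributed, so its standard deviation is close to its mean, which is why speckle behaves as multiplicative rather than additive noise.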
Speckle reduces the image contrast, which has a direct negative effect on texture-based analysis of the imagery [3]. Meanwhile, speckle noise also changes the spatial statistics of the underlying scene backscatter, which in turn makes the classification of the imagery a difficult task [5]. Obviously, to interpret RADAR imagery correctly one has to reduce (ideally remove) the effect of speckle noise. However, as the speckle noise reduction/removal process also changes the image, one should use a proper filter to keep the image degradation to a minimum. This paper reviews speckle noise reduction methods and, among them, studies the effect of the Lee-Sigma, Lee, Gamma-MAP, and Frost filters with different kernel sizes on SAR imagery.

2. SPECKLE NOISE AND ITS REDUCTION

Radar waves can interfere constructively or destructively to produce light and dark pixels known as speckle noise. Speckle noise is commonly observed in radar (microwave or millimetre wave) sensing systems, although it may appear in any type of remotely sensed image that uses coherent radiation. Like the light from a laser, the waves emitted by active sensors travel in phase and interact minimally on their way to the target area. After interaction with the target area, these waves are no longer in phase because of the different distances they travel from targets, or because of single versus multiple bounce scattering. Once out of phase, radar waves can interact to produce light and dark pixels known as speckle noise. Speckle noise in radar data is assumed to follow a multiplicative error model and must be reduced before the data can be utilized; otherwise the noise is incorporated into, and degrades, the image quality. Ideally, speckle noise in radar images should be completely removed; in practice, however, it can only be reduced significantly. Reducing the effect of speckle noise permits both better discrimination of scene targets and easier automatic image segmentation [3][5].

Spatial filters are categorized into two different groups, non-adaptive and adaptive. Non-adaptive filters take the parameters of the whole image signal into consideration and leave out the local properties of the terrain backscatter or the nature of the sensor. These kinds of filters are not appropriate for non-stationary scene signals. The Fast Fourier Transform (FFT) is an example of such a filter. On the other hand, adaptive filters accommodate changes in the local properties of the terrain backscatter as well as the nature of the sensor. In these filters, the speckle noise is considered stationary, but changes in the mean backscatter due to changes in the type of target are taken into consideration. Adaptive filters reduce speckle while preserving edges (sharp contrast variations). These filters modify the image based on statistics extracted from the local environment of each pixel [6]. An adaptive filter varies the contrast stretch for each pixel depending upon the Digital Number (DN) values in the surrounding moving kernel. Obviously, a filter that adapts the stretch to the region of interest (the area within the moving kernel) produces a better enhancement. The Lee-Sigma, Lee, Gamma-MAP and Frost filters are examples of such filters. Studying the effects of these filters is the subject of this paper; they are therefore described in a little more detail in the next section.

2.1. SPECKLE FILTERING

As implicitly mentioned above, speckle filtering consists of moving a kernel over each pixel in the image, applying a mathematical calculation using the pixel values under the kernel, and replacing the central pixel with the calculated value. The kernel is moved along the image one pixel at a time until the entire image has been covered. By applying the filter, a smoothing effect is achieved and the visual appearance of the speckle is reduced. A schematic sketch of this moving-window process is given below.
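The following minimal, unoptimized Python sketch illustrates the moving-kernel process in its simplest form; the function name and its default local-mean statistic are illustrative only and are not one of the four filters evaluated in this paper.

```python
import numpy as np

def kernel_filter(img, size=3, statistic=np.mean):
    """Generic moving-kernel filter as described in Section 2.1: slide an
    n x n window over every pixel and replace the centre pixel with a value
    computed from the window (here, by default, the local mean)."""
    pad = size // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.empty(img.shape, dtype=np.float64)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            window = padded[r:r + size, c:c + size]
            out[r, c] = statistic(window)   # replace the central pixel
    return out

# e.g. smoothed = kernel_filter(sar_image, size=3)
```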
2.1.1. Lee-Sigma and Lee Filters: The Lee-Sigma and Lee filters utilize the statistical distribution of the DN values within the moving kernel to estimate the value of the pixel of interest. Both filters assume a Gaussian distribution for the noise in the image data. The Lee filter is based on the assumption that the mean and variance of the pixel of interest are equal to the local mean and variance of all pixels within the user-selected moving kernel. The formula used for the Lee filter is [7]

DN_{out} = \bar{DN} + K\,[DN_{in} - \bar{DN}]    (2)

where \bar{DN} (Mean) is the average of the pixels in the moving window,

K = \frac{\mathrm{Var}(x)}{[\bar{DN}]^{2}\,[\mathrm{Sigma}]^{2} + \mathrm{Var}(x)}

and

\mathrm{Var}(x) = \frac{[\text{Variance within window}] + [\text{Mean within window}]^{2}}{[\mathrm{Sigma}]^{2} + 1} - [\text{Mean within window}]^{2}.

The Sigma filter is based on the probability of a Gaussian distribution. It is assumed that 95.5% of random samples fall within a two-standard-deviation range. This noise suppression filter replaces the pixel of interest with the average of all DN values within the moving kernel that fall within the designated range [8].

2.1.2. Gamma-MAP Filter: The Maximum A Posteriori (MAP) filter is based on a multiplicative noise model with non-stationary mean and variance parameters. This filter assumes that the original DN value lies between the DN of the pixel of interest and the average DN of the moving kernel. Many speckle reduction filters assume a Gaussian distribution for the speckle noise; however, recent work has shown this to be an invalid assumption, and naturally vegetated areas have been shown to be more properly modelled as having a Gamma-distributed cross-section. The Gamma-MAP algorithm incorporates this assumption, and its exact formula is the following cubic equation [9]:

\hat{I}^{3} - \bar{I}\,\hat{I}^{2} + \sigma\,(\hat{I} - DN) = 0    (3)

where \hat{I} = sought value, \bar{I} = local mean, DN = input value, and \sigma = the original image variance.

The Gamma-MAP logic maximizes the a posteriori probability density function with respect to the original image. It combines both geometrical and statistical properties of the local area [9]. The filtering is controlled by both the variation coefficient and the geometrical ratio operators extended to line detection [10].

2.1.3. Frost Filter: The Frost filter replaces the pixel of interest with a weighted sum of the values within the n x n moving kernel. The weighting factors decrease with distance from the pixel of interest and increase for the central pixels as the variance within the kernel increases. This filter assumes multiplicative noise with stationary noise statistics and follows the formula

DN = \sum_{n \times n} k\,\alpha\,e^{-\alpha |t|}    (4)

where \alpha = (4 / n\bar{\sigma}^{2})(\sigma^{2} / \bar{I}^{2}), k = normalization constant, \bar{I} = local mean, \sigma = local variance, \bar{\sigma} = image coefficient of variation value, |t| = |X - X_{0}| + |Y - Y_{0}|, and n = moving kernel size [11].
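As a concrete example of one of these adaptive filters, the sketch below implements the Lee filter of Eq. (2) from local window statistics. The use of scipy for the window means and the default value of sigma (the speckle coefficient of variation, [Sigma] in Eq. (2), often taken near 1/sqrt(number of looks)) are assumptions made for the illustration, not values from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, kernel_size=5, sigma=0.25):
    """Lee filter following Eq. (2). `sigma` is the speckle coefficient of
    variation ([Sigma] in the text); 0.25 is only an assumed default."""
    img = img.astype(np.float64)
    mean_w = uniform_filter(img, kernel_size)                    # mean within window
    var_w = uniform_filter(img**2, kernel_size) - mean_w**2      # variance within window
    # Var(x): estimated variance of the noise-free scene, per Eq. (2)
    var_x = (var_w + mean_w**2) / (sigma**2 + 1.0) - mean_w**2
    var_x = np.maximum(var_x, 0.0)
    k = var_x / (mean_w**2 * sigma**2 + var_x + 1e-12)           # weighting factor K
    return mean_w + k * (img - mean_w)                           # DN_out
```

In homogeneous areas Var(x) is small, K tends to zero and the output approaches the local mean; near edges Var(x) is large, K tends to one and the original pixel value is preserved, which is the edge-preserving behaviour described above.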

3. SAR GEOMETRY

The SAR viewing geometry refers to the geometry between the transmitted SAR pulse and the ground targets. The main parameter in this geometry is the local incidence angle, defined as the angle between the SAR range vector and the surface normal of each terrain element imaged by the SAR. Any slope on the particular ground surface being imaged will significantly alter the geometry of the signal-target interaction [2][3].

A SAR is a distance-measuring device. The SAR system measures the time delay between transmission and reception of a pulse in order to determine where targets are relative to one another in the range direction. When a satellite SAR is imaging a steep relief feature such as a mountain, the radar pulse can reach the top of the mountain first and the bottom of the mountain last. Thus, from the SAR's perspective, the top of the mountain is closer than its base. When subsequently portrayed on a two-dimensional image, the mountain appears to lean towards the sensor, resulting in the displacement of mountain tops and other topographic features from their true orthographic positions. These distortions can become quite large, and are referred to as shadow, foreshortening, and layover.

Shadows on a SAR image indicate those areas on the ground that have not been illuminated by the SAR signal. Foreshortening occurs when the local incidence angle is smaller than the illumination angle but larger than zero. This type of distortion appears on an image as if the sensor-facing slope is shortened and the feature is leaning towards the sensor (hence the term foreshortening). Layover is an extreme form of foreshortening. For small incidence angles or very steep ground relief, the backscatter often returns from the top of the feature before the base; this occurs where the local incidence angle is greater than the incidence angle. On the SAR image, it appears as if the highest point of the vertical feature is laid over its base in the direction of the sensor. As with foreshortening, features that exhibit layover have very bright sensor-facing slopes [3].

4. GEOMETRIC CORRECTIONS

Radar systems are side-looking, distance-measuring systems; thus the key geometric parameters are the incidence angle, local incidence angle and look direction. The side-looking geometry of radar results in several geometric distortions, such as slant-range scale distortions and relief distortions. Geometric corrections include slant-to-ground range conversion, registration, and local incidence angle corrections (if topographic information is available). Generally speaking, geometric correction algorithms are classified into three methods:

- Slant-to-ground method
- Polynomial method (best-fit approximations)
- Radargrammetric method (known sensor geometry)

Ground Control Points (GCPs) are used to establish and/or refine the transformation.

4.1 Slant/Ground Range Conversion

SAR data are acquired in slant range. Slant-to-ground range conversion is used to project the acquired image onto the ground system. To do this, one needs to know (or assume) the imaging geometry, platform altitude, range delay and terrain elevation. Resampling is used to give uniform pixel spacing (in ground range) across the image swath, as sketched below. Slant-to-ground range conversion can be done during signal processing or during image processing; generally, it is applied after radiometric correction. The approaches and algorithms used are a function of the analysis objectives. RADARSAT ground range products assume a sea-level ellipsoid earth model with zero relief.
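The following toy Python sketch shows the idea of Section 4.1 under a flat-earth, zero-relief assumption: ground range is obtained from slant range and platform altitude, and one image line is then resampled to a uniform ground-range spacing. Operational products such as RADARSAT's use an ellipsoidal earth model instead, so this is an illustration only, and all function names are hypothetical.

```python
import numpy as np

def slant_to_ground_range(slant_ranges_m, altitude_m):
    """Flat-earth approximation: ground range from slant range and platform
    altitude (real products use an ellipsoidal earth model, see text)."""
    return np.sqrt(np.maximum(slant_ranges_m**2 - altitude_m**2, 0.0))

def resample_line_to_ground_range(line_samples, slant_ranges_m, altitude_m,
                                  pixel_spacing_m):
    """Resample one image line from slant range to uniform ground-range spacing."""
    ground = slant_to_ground_range(np.asarray(slant_ranges_m, float), altitude_m)
    uniform = np.arange(ground[0], ground[-1], pixel_spacing_m)
    return uniform, np.interp(uniform, ground, line_samples)
```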

4.2. Image Registration: Polynomial Transforms

The polynomial transform uses a best-fit model. A first-order polynomial is a shift and rotation of the image, whereas a third-order polynomial is a complex warping of the image. Second-order polynomials are used for images requiring nonlinear warping, and third- and higher-order polynomials create a more complex image transformation. Higher-order transforms require a greater number of ground control points (GCPs) to produce the transform model, and a high order does not guarantee higher accuracy: a higher order usually ties the image down at the GCPs, but can increase errors between the GCPs. A small least-squares sketch of a first-order transform is given after Section 4.3 below.

4.3 Radargrammetric Method

Geocoding is the geometric correction of image data to a map projection. The traditional method of geocoding is the polynomial transform. This method neither models the viewing geometry nor uses elevation data to correct for topography. The most accurate geocoding method is the radargrammetric method. The radargrammetric process consists of three steps:

- Ephemeris modelling and refinement (if GCPs are provided)
- Sparse mapping grid generation
- Output formation (including terrain corrections)

The radargrammetric method uses an analytical formulation of the distortions introduced during image formation. The geometric correction is therefore done using the platform (ephemeris and ancillary data), the sensor (integration time, pulse length, depression angle), and DEM information. The output of radargrammetry is an ortho-image corrected for all distortions, including relief. The planimetric accuracy of the final ortho-image depends on the accuracy of the GCPs and the DEM [12][13]. The advantages of the radargrammetric method are as follows:

- Unified projection system.
- Direct image-to-terrain correction.
- Only one resampling of the image (slant range to map projection is done directly; no intermediate conversion to ground range is required).
- Homogeneity in the ortho-image generation.
- Use of a DEM or a mean altitude.
- Better integration with GIS or digital maps.
- Comprehension and control of the full geometric process and of the resulting errors.
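Returning to the polynomial registration of Section 4.2, the sketch below fits a first-order (affine) polynomial transform to a set of GCPs by least squares and reports per-axis and combined planimetric RMS errors of the kind listed later in Table 2. The function names and interface are illustrative, not those of any particular software package.

```python
import numpy as np

def fit_first_order(img_xy, map_xy):
    """Least-squares first-order polynomial (affine) transform from image
    coordinates to map coordinates, estimated from GCPs."""
    img_xy, map_xy = np.asarray(img_xy, float), np.asarray(map_xy, float)
    A = np.column_stack([np.ones(len(img_xy)), img_xy[:, 0], img_xy[:, 1]])
    coef_x, *_ = np.linalg.lstsq(A, map_xy[:, 0], rcond=None)
    coef_y, *_ = np.linalg.lstsq(A, map_xy[:, 1], rcond=None)
    return A, coef_x, coef_y

def rms_errors(A, coef_x, coef_y, map_xy):
    """Per-axis and combined planimetric RMS errors at the control points."""
    map_xy = np.asarray(map_xy, float)
    dx = A @ coef_x - map_xy[:, 0]
    dy = A @ coef_y - map_xy[:, 1]
    return (np.sqrt(np.mean(dx**2)), np.sqrt(np.mean(dy**2)),
            np.sqrt(np.mean(dx**2 + dy**2)))
```

A second-order transform would simply add x^2, xy and y^2 columns to the design matrix, which is why higher orders need more GCPs to remain well determined.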

5. NUMERICAL RESULTS

5.1 Radiometric Correction

The real imagery used for the numerical experiments is a 185 by 119 pixel raw SAR image of Death Valley, located at longitude 116 30 50 W and latitude 36 36 30 N. The spatial resolution of this SAR image is 12.5 m (Figure 1).

Figure 1. Raw image with speckle noise

To remove or reduce the speckle noise, all of the above-mentioned filters are applied, each with two different kernel sizes, 3x3 and 5x5. Table 1 shows the results, and Figure 3 plots the Mean/Stdv ratio, which is used as the criterion for evaluating the performance of the filters. Based on the results, the 5x5 Gamma-MAP filter performs better than the other filters. The filtered image is shown in Figure 2.

Figure 2. Filtered image with Gamma-MAP

Table 1. Statistical values for the filtered images

Filter           MEAN     STDV     Mean/STDV
Raw image        6176.40  1757.45  3.51
Gamma-MAP 3x3    6122.49  1226.82  4.99
Frost 3x3        6137.27  1464.69  4.19
Lee 3x3          6138.48  1258.18  4.88
Lee-Sigma 3x3    6130.05  1266.56  4.84
Gamma-MAP 5x5    6085.75  944.56   6.44
Frost 5x5        6160.57  1181.99  5.21
Lee 5x5          6113.00  1003.80  6.09
Lee-Sigma 5x5    6090.83  997.54   6.11

Figure 3. Mean/Stdv values for the filtered images
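The Mean/STDV criterion of Table 1 is simply the ratio of the image mean to its standard deviation, computed for each filtered result; a minimal sketch is given below, with the values quoted in the comment taken from Table 1 above.

```python
import numpy as np

def mean_std_ratio(img):
    """Mean/STDV criterion used in Table 1: a higher ratio indicates
    stronger speckle suppression (at the cost of some smoothing)."""
    img = np.asarray(img, dtype=np.float64)
    return img.mean() / img.std()

# For the raw image in Table 1 this ratio is 6176.40 / 1757.45 ~= 3.51;
# for the 5x5 Gamma-MAP result it rises to 6085.75 / 944.56 ~= 6.44.
```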

5.2. Geometric Correction

The above-mentioned geometric corrections have been applied to the radiometrically corrected image, which is shown in Figure 4. As the image is already in ground range format, there is no need to apply a slant-to-ground range conversion. Three different geometric correction methods are used to evaluate their performance:

- Polynomial with 5 GCPs
- Geocoding using ephemeris
- Orthorectification using radargrammetry (using ephemeris, GCPs and a DEM)

Table 2 and Figures 5 and 6 show the results. The radargrammetric method is more efficient than the other two geometric correction methods; in this method the relief distortion is eliminated through the use of the DEM. The polynomial method with 5 GCPs also gives better results than geocoding using the ephemeris. Moreover, the RMS error in the X direction is larger than the corresponding value in the Y direction, because the relief distortion is larger in the X direction owing to the imaging geometry. Figure 6 shows the orthorectified image, in which the relief distortions are clearly eliminated.

Figure 4. Radarsat image of Death Valley, California. Pixel size 12.5 m in range and azimuth direction. Acquisition date 1996-10-15, C band. Image is in ground range.

Table 2. RMS errors for the geometric correction methods

Geometric Correction Method                                          X RMS Error (m)  Y RMS Error (m)  RMS Error (m)
Geocoding with orbit and satellite parameters (with 5 check points)  896.5            208.0            920.3
Polynomial (with 5 GCPs)                                             317.4            19.6             318.0
Orthorectification with DEM and 5 GCPs                               41.8             28.9             50.8
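The last column of Table 2 is consistent with combining the per-axis values as a root-sum-of-squares; a quick check for the radargrammetric row, using the values from Table 2:

```python
import math

x_rms, y_rms = 41.8, 28.9                    # X and Y RMS errors in metres (Table 2)
print(round(math.hypot(x_rms, y_rms), 1))    # 50.8 m, matching the last column
```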

Figure 5. RMS errors for the three geometric correction methods

Figure 6. Orthorectified image with DEM and GCPs

6. Conclusions and Remarks

Radar remote sensing systems provide a relatively new source of information. Because of their unique capabilities compared with optical systems, it is worth putting more effort into studying their radiometric and geometric characteristics. Speckle noise is an inherent part of radar images, and different methods exist to reduce it. This paper used four different filters with two different kernel sizes. The results of the numerical experiments show that Gamma-MAP provides better results than the other three filters used here; one of the reasons seems to be this filter's use of a Gamma distribution for the speckle noise. Geometric corrections for radar imagery differ from those for optical imagery, because the imaging geometries are different. Three different geometric corrections have been applied here, and the results show that the radargrammetric method performs better than the other two. The reason seems obvious, as radargrammetry considers the imaging geometry and uses both the orbital parameters of the sensor and a DEM of the region. Last but not least, it is suggested that these tests be repeated with other radar imagery.

References

1. Lillesand, T.M., Kiefer, R.W., 2000. Remote Sensing and Image Interpretation. Fourth Edition. John Wiley & Sons, New York.
2. Henderson, F.M., Lewis, A.J., 1998. Principles and Applications of Imaging Radar. Volume 1, John Wiley & Sons Inc., New York.
3. Raney, R.K., 1998. Radar Fundamentals: Technical Perspective. Chapter 2 in Principles and Applications of Imaging Radar, Manual of Remote Sensing, Third Edition, Volume 2, ASPRS, John Wiley and Sons Inc., Toronto.
4. InfoSAR Ltd, 2006. InfoPACK User Guide Version 1.0, http://www.infosar.co.uk (accessed 27 Jan. 2006).
5. Durand, M.J., Gimonet, B.J., Perbos, J.R., 1987. SAR Data Filtering for Classification. IEEE Transactions on Geoscience and Remote Sensing, Vol. GE-25, No. 5, pp. 629-637.
6. Lopes, A., Nezry, E., Touzi, R., Laur, H., 1993. Structure Detection and Statistical Adaptive Speckle Filtering in SAR Images. International Journal of Remote Sensing, Vol. 14, No. 9, pp. 1735-1758.
7. Lee, J.S., 1981. Speckle Analysis and Smoothing of Synthetic Aperture Radar Images. Computer Graphics and Image Processing, Vol. 17, pp. 24-32.
8. Lee, J.S., 1983. Digital Image Smoothing and the Sigma Filter. Computer Vision, Graphics and Image Processing, Vol. 24, pp. 255-269.
9. Lopes, A., Nezry, E., Touzi, R., Laur, H., 1990. Maximum A Posteriori Speckle Filtering and First Order Texture Models in SAR Images. International Geoscience and Remote Sensing Symposium (IGARSS).
10. Touzi, R., Lopes, A., Bousquet, P., 1988. A Statistical and Geometrical Edge Detector for SAR Images. IEEE Transactions on Geoscience and Remote Sensing, Vol. 26, No. 6, pp. 764-773.
11. Frost, V.S., Stiles, J.A., Shanmugan, K.S., Holtzman, J.C., 1982. A Model for Radar Images and Its Application to Adaptive Digital Filtering of Multiplicative Noise. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-4, No. 2, March 1982.
12. Toutin, Th., Carbonneau, Y., 1992. MOS and SEASAT Image Geometric Correction. IEEE Transactions on Geoscience and Remote Sensing, Vol. 30, No. 3, pp. 603-609.
13. Toutin, Th., Rivard, B., 1997. Value-Added RADARSAT Products for Geoscientific Applications. Canadian Journal of Remote Sensing, Vol. 23, No. 1, pp. 63-70.
14. ERDAS IMAGINE 8.5 Software, help documentation, 2001.