Technology Brief: Scene Matching Area Correlation Technology using Millimeter Wave (MMW) Image

AUG Signals' scene alignment technology registers real-time MMW images against reference MMW or other images of the same scene, even when the scene has undergone significant changes. The scene matching technology can be used for navigation, reconnaissance and disaster management. AUG Signals has also developed Automatic Target Recognition (ATR) based on airborne active MMW radar data.

1. OPERATIONAL USES - BACKGROUND

The millimeter-wave region of the electromagnetic spectrum is usually considered to be the range of wavelengths from 10 millimeters (0.4 inches) to 1 millimeter (0.04 inches). Millimeter waves are therefore longer than infrared waves or x-rays, for example, but shorter than radio waves or microwaves. The millimeter-wave region corresponds to radio frequencies of 30 GHz to 300 GHz and is sometimes called the Extremely High Frequency (EHF) range.

The high frequency of millimeter waves, as well as their propagation characteristics, makes them useful for a variety of applications, including transmitting large amounts of computer data, cellular communications, and radar. Radar is an important use of millimeter waves because it exploits a key property of millimeter-wave propagation: beamwidth. Beamwidth is a measure of how much a transmitted beam spreads out as it travels away from its point of origin. In radar it is desirable to have a beam that stays narrow rather than fanning out, because a narrow beam allows the radar to resolve small, distant objects, much like a telescope. A carefully designed antenna can focus microwaves into a narrow beam, just as a magnifying glass focuses sunlight. Unfortunately, small beamwidths normally require large antennas, which makes it difficult to fit a capable radar set into, for example, a cramped airplane nose. The use of millimeter wavelengths allows engineers to overcome this antenna size problem: for a given antenna size, the beamwidth shrinks as the frequency increases, so the same narrow beam can be obtained with a much smaller antenna. MMW radar has therefore been applied to precision missile guidance, especially active terminal guidance, because of its high frequency, narrow beamwidth with a small antenna aperture, and well-developed integrated devices.

While optical systems (visible and IR) require clear atmospheric conditions for reliable operation, MMW imaging is relatively immune to weather conditions such as cloud, fog, snow, and light rain. For example, the atmospheric attenuation at millimeter-wave frequencies is 0.07 to 3 dB/km in drizzle and fog conditions, whereas it is one to three orders of magnitude higher at optical frequencies (exceeding 100 dB/km in foggy conditions). MMW imaging has shown distinct advantages for the detection of terrestrial targets under optically obscuring conditions such as cloud, haze, snow, and light rain.

Autonomous guidance, particularly the missile seeker, is a key application area for scene matching area correlation technology using MMW imagery. Scene matching based guidance is used in the terminal stage of a missile's flight; it matches high-resolution, real-time captured images against reference images to correct for target measurement error, initial launch error, accumulated inertial error and re-entry error.
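To illustrate the basic idea behind scene matching area correlation, the sketch below slides a real-time image patch across a reference image, scores each candidate offset with a zero-mean normalized cross-correlation, and converts the best-matching offset into a position correction. This is a minimal, generic illustration; the array sizes, pixel spacing and variable names are assumptions and it does not represent AUG Signals' algorithms.

```python
import numpy as np

def normalized_cross_correlation(patch, window):
    """Zero-mean normalized cross-correlation between two equal-size arrays."""
    p = patch - patch.mean()
    w = window - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return float((p * w).sum() / denom) if denom > 0 else 0.0

def match_scene(realtime_patch, reference_image):
    """Exhaustively search the reference image for the best-matching offset.

    Returns the (row, col) of the top-left corner of the best match and its score.
    """
    ph, pw = realtime_patch.shape
    rh, rw = reference_image.shape
    best_score, best_offset = -1.0, (0, 0)
    for r in range(rh - ph + 1):
        for c in range(rw - pw + 1):
            score = normalized_cross_correlation(
                realtime_patch, reference_image[r:r + ph, c:c + pw])
            if score > best_score:
                best_score, best_offset = score, (r, c)
    return best_offset, best_score

# Hypothetical usage: the navigation system predicts where the patch should sit
# in the reference map; the residual between the predicted and matched offsets
# is the position error fed back to the navigator.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.normal(size=(200, 200))          # reference MMW image (assumed)
    predicted = (80, 120)                            # predicted patch location
    truth = (83, 117)                                # true location (3 px navigation error)
    patch = reference[truth[0]:truth[0] + 32, truth[1]:truth[1] + 32]
    matched, score = match_scene(patch, reference)
    pixel_spacing_m = 2.0                            # assumed ground sample distance
    error_m = ((matched[0] - predicted[0]) * pixel_spacing_m,
               (matched[1] - predicted[1]) * pixel_spacing_m)
    print("matched offset:", matched, "score: %.2f" % score,
          "position correction (m):", error_m)
```

In practice the search would be restricted to the navigation system's uncertainty region, and more robust similarity measures and transformation models (discussed in Section 2.1) would typically be used.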
The navigation error correction capability enables highly accurate missile strikes and also improves survivability against a surprise attack by creating favorable conditions for launch and re-entry maneuvering. There are many examples of missiles that use MMW seeker technology for active guidance:

Dual mode Brimstone missile - This is an air-launched ground attack missile with a range of more than 60 km, developed in the United Kingdom. The dual-mode version of the Brimstone missile is equipped with a MMW radar seeker for target recognition which operates in the 94 GHz frequency range. The missile uses its radar to differentiate between valid and invalid targets, and the MMW radar helps the missile search for targets in assigned areas to minimize the risk to friendly forces. The missile destroys fast-moving targets in cluttered environments by applying joint Semi-Active Laser (SAL) and MMW guidance. Targets can be destroyed in a fire-and-forget armor mode requiring no further input from the operator. The weapon was widely used by the Royal Air Force in Afghanistan, and more recently Libya, where it generated the interest of the US Air Force as well as the US Navy and others. An anti-boat version, called Sea Spear, is being developed which will be particularly effective against swarm attacks by fast patrol craft.

Longbow HELLFIRE missile - The Longbow HELLFIRE (AGM-114L) is a precision strike missile using Ka-band millimeter wave (MMW) radar guidance. It is the principal antitank system for the AH-64D Apache Longbow helicopter and has a range of up to 8 km. The missile can also be launched from other platforms, such as rotary- and fixed-wing aircraft, waterborne vessels and land-based systems, against a variety of ground and maritime targets. The MMW seeker provides beyond line-of-sight, fire-and-forget capability, as well as the ability to operate in adverse weather and battlefield obscurants. The Longbow system is mainly being developed for integration into the Apache attack helicopter and the Comanche armed reconnaissance helicopter.

AGM-88E AARGM - The Advanced Anti-Radiation Guided Missile (AARGM) is a tactical, air-to-surface missile designed to home in on surface-to-air radar systems. Developed by a joint venture of the Italian Ministry of Defense and the US Department of Defense, the missile is an upgrade of the High-speed Anti-Radiation Missile (HARM). The AARGM features enhanced capabilities intended to counter radar shutdown and passive radar by means of an additional active millimeter wave seeker. The primary users of this missile are the US Navy and the Italian Air Force.

AGM-158 JASSM - The Joint Air-to-Surface Standoff Missile (JASSM) is a long-range, air-to-ground precision standoff missile designed primarily for the US Air Force (USAF). The JASSM flies automatically along a predetermined route using an onboard inertial navigation system that includes an Anti-Jam Global Positioning System (AJGPS) and a ring laser gyro inertial measurement unit. Additional navigation and guidance features include an imaging infrared (I2R) seeker and an automatic target correlator (ATC) for high-precision strikes. It can also carry powered Low Cost Autonomous Attack System (LOCAAS) sub-munitions integrating a dual-mode laser detection and ranging (LADAR) system and a MMW seeker.

Joint Air-to-Ground Missile - This missile, abbreviated as JAGM, is being developed for the US Army. The seeker combines Hellfire semi-active laser and Longbow millimeter wave radar guidance. The dual-mode guidance capability was recently demonstrated by engaging a laser-designated moving target.

Kh-25MAE / AS-10 Karen - The Kh-25MAE missile, built in Russia, employs an inertial navigation system and a Ka-band active radar seeker similar to the seeker in the AGM-114L, but with a narrower antenna scan angle and a larger aperture. The missile is intended for attacks on a wide range of surface targets including vehicles, parked aircraft, helicopters, Command, Control and Communication (C3) targets, Petroleum, Oil & Lubricants (POL) targets, and structures, under day, night and adverse weather conditions.

Scene matching area correlation technology using MMW radar imagery is also widely applied in automatic navigation systems for manned and unmanned aircraft. The alternatives have well-known limitations: GPS can be easily jammed, vision-based navigation degrades in rain, fog, snow and dust, and inertial navigation systems suffer from error accumulation over time. MMW-based scene matching area correlation technology can accurately match real-time target images to reference images and thus correct the accumulated navigation error.

In static target recognition applications, such as the recognition of buildings and structures by missile seekers, AUG Signals' scene alignment technology can be applied to align real-time MMW and reference MMW or other images of the same scene, even if the scene has undergone significant changes. The scene matching technology can also be used in military applications such as rotary- and fixed-wing aircraft navigation. Aligned images can also be used for change detection, which has military applications, such as reconnaissance, and civilian applications in agriculture and disaster management. AUG Signals has also developed technology for Automatic Target Recognition (ATR) of mobile targets by missile seekers equipped with active MMW radar.

2. TECHNOLOGY DESCRIPTION

Two types of MMW image matching technologies are described in the following: scene alignment technology, for identifying specific structures and for navigation of rotary- and fixed-wing aircraft; and Automatic Target Recognition (ATR) technology, for missile seekers homing on mobile targets.

2.1 IMAGE ALIGNMENT TECHNOLOGY

The advanced signal processing algorithms discussed below provide the capability to accurately align a target MMW image with reference imagery and, as a result of this alignment, to compute the transformation between the real-time image and the reference image. The target image is a real-time image, while the reference image is a previously recorded MMW, SAR or optical (visible or infrared) image. Reference images can also consist of feature information extracted from MMW, SAR or optical images. The algorithms can be adapted to the different types of prior reference images, as required by specific operational scenarios.
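As a concrete illustration of computing the transformation between a target and a reference image, the sketch below estimates an affine transformation from a set of matched point pairs by linear least squares. This is a generic textbook formulation, not AUG Signals' proprietary matched-points algorithm; the point coordinates are invented for the example.

```python
import numpy as np

def estimate_affine(target_pts, reference_pts):
    """Least-squares affine fit mapping target (x, y) points onto reference points.

    Solves for A (2x2) and t (2,) in  reference ≈ A @ target + t.
    Requires at least three non-collinear point pairs.
    """
    target_pts = np.asarray(target_pts, dtype=float)
    reference_pts = np.asarray(reference_pts, dtype=float)
    n = len(target_pts)
    # Design matrix: one row [x, y, 1] per matched point.
    design = np.hstack([target_pts, np.ones((n, 1))])
    params, _, _, _ = np.linalg.lstsq(design, reference_pts, rcond=None)
    A = params[:2].T          # rotation / scale / shear part
    t = params[2]             # translation part
    rms = np.sqrt(np.mean((design @ params - reference_pts) ** 2))
    return A, t, rms

# Hypothetical matched points (e.g. the output of a matched-points identification step).
target = [(10, 12), (45, 80), (200, 150), (120, 33)]
reference = [(14.0, 10.5), (49.2, 78.0), (203.8, 147.1), (124.1, 31.4)]
A, t, rms = estimate_affine(target, reference)
print("linear part:\n", A, "\ntranslation:", t, "\nRMS residual (pixels):", rms)
```

Note that, in addition to the translation, the fit recovers rotation and scaling between the two images, and the RMS residual gives a sub-pixel measure of alignment quality, which is in the spirit of the advantages listed in the next subsection.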

Figure 1. Illustration of Image Alignment technology.

Reference imagery is extracted from a database using estimated radar coverage information, which includes the coverage resolution, the approximate centre and size, and the estimated error in the radar coverage information. Depending on the region of uncertainty of the radar coverage, the reference image can be a single image from the database or a composite of multiple database images. AUG Signals provides different feature extraction options for generating reference images, including Maximum Strength Pruned Ratio of Averages and contour detection-based edge detection, geometric feature images, image histograms, invariant features, Gabor and Tamura texture features, local features and region-based features. Either preprocessed MMW images or feature information extracted from MMW images, SAR images, Electro-Optical images, Digital Elevation Model (DEM) or other topography data can be used as reference imagery.

The processing chain for MMW image alignment includes image preprocessing and image matching. Two tasks are performed in the preprocessing stage: geometric rectification and image enhancement. The goal of the geometric rectification operation is to remove geometric, terrain and sensor distortions where possible; aircraft/missile navigation information (estimated position, altitude and orientation) and the MMW radar-aircraft geometry are used for this. Image enhancement procedures, such as denoising and deblurring, are then applied to the MMW image for more effective image alignment. The preprocessing stage ensures improved missile/aircraft navigation and missile homing.
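A minimal sketch of the enhancement half of this preprocessing stage is shown below, using generic off-the-shelf filters: median filtering for impulsive, speckle-like noise, followed by a Wiener filter as a simple denoise/deblur step. The filter choices, window sizes and synthetic test image are assumptions for illustration and do not reflect AUG Signals' internal implementation.

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import wiener

def enhance_mmw_image(image, median_size=3, wiener_size=5):
    """Generic enhancement: suppress impulsive/speckle-like noise, then apply
    a Wiener filter as a simple denoising/deblurring stage."""
    despeckled = median_filter(image, size=median_size)
    restored = wiener(despeckled, mysize=wiener_size)
    return restored

# Synthetic example: a bright square "target" on a noisy background.
rng = np.random.default_rng(1)
scene = np.zeros((128, 128))
scene[60:70, 60:70] = 5.0                      # target return
noisy = scene + rng.normal(scale=1.0, size=scene.shape)
enhanced = enhance_mmw_image(noisy)
print("background std before/after: %.2f / %.2f"
      % (noisy[:40, :40].std(), enhanced[:40, :40].std()))
```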

Next, a reference image is selected based on the target image coordinates, and AUG Signals' matched points identification & relative transformation estimation is performed between the target image and the reference image, generating a set of points matched between the target-reference pair. The match information is used to determine the relative transformation, which is critical for navigation and missile homing. The matched points identification & relative transformation estimation is a unique capability of AUG Signals that delivers the following advantages:

- Rotation and scaling factors between the two images, in addition to translation;
- Sub-pixel accuracy in the RMS error sense;
- An inherent ability to ignore local regions that do not match between the target and reference images when a strong match is present elsewhere.

For image matching, AUG Signals offers different options, including:

- Area matching/correlation options - sum of absolute differences, sum of squared differences, cross-correlation coefficient, mutual information, sequential similarity detection algorithm and coarse-to-fine search.
- Relative transformation options - linear conformal transformation, affine transformation, projective transformation and polynomial transformation.

2.2 AUTOMATIC TARGET RECOGNITION (ATR) TECHNOLOGY

An end-to-end ATR system used in an operational environment requires four stages: image enhancement, target detection, target segmentation and target recognition. Images are preprocessed in the first stage for the best performance in the subsequent three stages. In the target detection stage, non-target clutter in the MMW images is rejected and potential targets are separated. Next, the detected target locations are clustered and segmented to obtain separate suspected targets. Target recognition is performed in the last stage of ATR processing. The technology can be used to develop a comprehensive image exploitation system resulting in robust onboard ATR for MMW missile seekers.

The first stage of ATR is image enhancement. AUG Signals applies two major image enhancement technologies: noise reduction & smoothing, and image deblurring.

The second stage of ATR detects a set of targets in the enhanced MMW images. AUG Signals applies cutting-edge Constant False Alarm Rate (CFAR) technologies applicable to different target detection scenarios. A CFAR detector detects targets while keeping the false alarm rate under a user-defined level: assuming a distribution for the background clutter, a detection threshold is selected based on the user-defined false alarm rate. CFAR detection is an adaptive process. To maintain the overall false alarm rate under the required level, sets of pixels are examined separately and a threshold is calculated for each pixel. For a specific pixel, the first step is to estimate the surrounding background distribution. Two windows are used to define the foreground and background pixels: the window with the magenta border is called the target window, and the outer box is called the background window.

The pixels between these two windows are the background pixels used to estimate the background distribution, from which the threshold for the pixel under examination is calculated. The CFAR detection options provided by AUG Signals include:

- Model-based single CFAR detector - The model-based CFAR detector uses a probability density function (pdf) model, such as Gaussian, Cauchy, exponential, Weibull or Gamma, for the clutter pixel intensity. The parameters of the pdf model are estimated from the selected background pixels as an estimate of the local background pdf.
- Multi-CFAR detector - Unlike a single CFAR detector, the multi-CFAR detector runs several CFAR detectors on the same data and fuses their decisions using specific rules to obtain a final detection decision. The combination of CFAR detectors provides complementary information, which results in higher detection performance than any single detector while maintaining a constant false alarm rate.
- Markov-chain CFAR detector - AUG Signals' Markov-chain detection rule is based on the likelihood ratio of the transitions to the observed value under the alternative hypothesis (target present) and the null hypothesis (target absent).

Figure 2. Illustration of ATR processing
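To make the sliding-window CFAR idea concrete, here is a minimal sketch of a cell-averaging CFAR detector under a Gaussian clutter assumption. The guard (target) and background window sizes, the false alarm rate and the synthetic scene are assumptions for illustration; AUG Signals' model-based, multi-CFAR and Markov-chain detectors are more sophisticated than this.

```python
import numpy as np
from scipy.stats import norm

def ca_cfar(image, guard=2, background=6, pfa=1e-4):
    """Cell-averaging CFAR with a Gaussian clutter model.

    For each pixel, background statistics are estimated from the ring of pixels
    between the guard (target) window and the background window; the pixel is
    declared a detection if it exceeds mean + k * std, where k is set by the
    desired probability of false alarm.
    """
    k = norm.ppf(1.0 - pfa)                  # threshold multiplier for the given Pfa
    rows, cols = image.shape
    detections = np.zeros_like(image, dtype=bool)
    for r in range(background, rows - background):
        for c in range(background, cols - background):
            outer = image[r - background:r + background + 1,
                          c - background:c + background + 1].copy()
            # Exclude the guard/target window so a target does not bias the estimate.
            outer[background - guard:background + guard + 1,
                  background - guard:background + guard + 1] = np.nan
            ring = outer[~np.isnan(outer)]
            threshold = ring.mean() + k * ring.std()
            detections[r, c] = image[r, c] > threshold
    return detections

# Synthetic test: two point-like targets in Gaussian clutter.
rng = np.random.default_rng(2)
scene = rng.normal(loc=10.0, scale=1.0, size=(80, 80))
scene[20, 20] += 8.0
scene[55, 40] += 8.0
hits = ca_cfar(scene)
print("detected pixels:", np.argwhere(hits))
```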

In the segmentation stage of ATR, the output of the CFAR detector is clustered to separate the targets present. Clustering of the targets is achieved using morphological filters, and the output of this stage is a set of segmented targets.

After target segmentation, the next step in ATR is finding the specific targets of interest among all targets detected in the scene. Critical military operations, such as the homing of a fire-and-forget missile, can only be achieved through accurate target recognition. Sensors that can differentiate between targets, such as MMW, and a priori knowledge of the target signature are critical at this stage. The role of target recognition is to accurately identify the target of interest even in the presence of variability in target signatures. This variability results from several factors: ground clutter and errors in target segmentation contribute to it, and it is also caused by unknown target orientation (pose). Target orientation is an important, and often unknown, parameter in target recognition; AUG Signals applies regular resampling and fixed resampling procedures to address unknown target orientation.

Feature extraction is the next step of target recognition. AUG Signals uses a large selection of features that are effective in different scenarios, including the wavelet transform, 2D cepstrum, elliptical Fourier descriptors and the magnitude of the Fourier transform. If required, the feature dimension is reduced using a Principal Component Analysis (PCA) procedure. The target recognition procedure then uses the selected, dimension-reduced features obtained through the above procedures to identify the target of interest among the other targets. A training set is used to configure the target recognition procedure. AUG Signals employs four very effective target recognition procedures: the k-Nearest Neighbor (k-NN) algorithm, support vector machines, AdaBoost and binary decision trees.

3. GRAPHICAL USER INTERFACES (GUIs)

AUG Signals' MMW image matching technologies can be explored using two different GUIs: the MMW image alignment GUI and the Automatic Target Recognition (ATR) GUI.

3.1 MMW Image Alignment GUI

Figure 3 shows the GUI that provides a test bench for the MMW image alignment discussed in Section 2.1. The algorithm options discussed in Section 2.1 can be selected and evaluated using the GUI. The interface can be divided into three columns:

1. Input processing - The left column of the GUI includes image input and preprocessing options. Input options for target and reference images include raw images, and most of the standard image formats are supported. Preprocessing options are provided separately for target and reference images.

2. Image display - The middle column of the GUI displays the input, preprocessed and texture versions of the images. In addition, feature extraction options are provided that operate on both reference and target images. The Show Image buttons in the input processing column control which version of each image is displayed: original, preprocessed or texture.

3. Output Generation - The right column of the GUI provides options for target and reference image matching, displays the aligned images, and displays the estimate of the relative image transformation. Scene matching options include the similarity measure, the matching method and the relative transformation.

Figure 3: GUI for the MMW image alignment.

In order to streamline operations, the GUI provides options to save a sequence of commands for use on a different set of input files. It also provides zoom in/out and data cursor options for the displayed images.

3.2 Automatic Target Recognition (ATR) GUI

Figure 4 shows the GUI that provides a test bench for the ATR processing discussed in Section 2.2. The algorithm options discussed in Section 2.2 can be selected and evaluated using the GUI. The interface can be divided into two columns:

1. Classifier Training - The left column includes training data input, target of interest selection, target enhancement, feature extraction and training options. At each processing stage the output can be displayed, and the trained classifier can be saved so that it can be loaded directly when required for testing.

2. Classifier Test - The right column includes test data input, image enhancement, target detection, target segmentation and target recognition steps. Two types of data source are available: single file and multiple files. A single file provides an image that may contain multiple targets, so target detection and segmentation are required to separate them. When the input is multiple files, it is assumed that the targets have already been detected and segmented.
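As a simple illustration of the classification step exercised by the Classifier Training and Classifier Test columns, the sketch below trains a k-nearest-neighbor (k-NN) classifier on PCA-reduced feature vectors and applies it to a test sample. It is a generic scikit-learn example with synthetic features, not the classifier shipped with the GUI.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for extracted target features (e.g. Fourier-magnitude features):
# 60 training samples of 32-dimensional feature vectors, two target classes.
rng = np.random.default_rng(3)
features = np.vstack([rng.normal(0.0, 1.0, size=(30, 32)),
                      rng.normal(1.5, 1.0, size=(30, 32))])
labels = np.array([0] * 30 + [1] * 30)        # 0 = other target, 1 = target of interest

# Reduce the feature dimension with PCA, then classify with k-NN (k = 3).
classifier = make_pipeline(PCA(n_components=8), KNeighborsClassifier(n_neighbors=3))
classifier.fit(features, labels)

# "Classifier Test" side: a new, already-segmented target's feature vector.
test_sample = rng.normal(1.4, 1.0, size=(1, 32))
print("predicted class:", classifier.predict(test_sample)[0])
```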

In order to streamline operations, the GUI provides options to save a sequence of commands and reload it for use on a different set of input files.

Figure 4: GUI for the Automatic Target Recognition.