Wide-Area Motion Imagery for Multi-INT Situational Awareness

Bernard V. Brower (U.S.)
Harris Corporation
bbrower@harris.com
332 Initiative Drive
Rochester, NY 14624

Jason Baker (U.S.)
Harris Corporation
JBAKER27@harris.com
800 Lee Road
Rochester, NY 14606

Brian Wenink (U.S.)
Harris Corporation
bwenink@harris.com
800 Lee Road
Rochester, NY 14606

1. ABSTRACT

With just over 10 years of operational use, wide-area motion imagery (WAMI) is still an emerging technology compared to many other intelligence, surveillance, and reconnaissance (ISR) sensing technologies. WAMI supports two primary workflows: real-time situational awareness missions and collection for post-processing and forensic analysis. The situational awareness mission is similar to how a full-motion video (FMV) sensor might be used, but with the added benefit of viewing multiple places simultaneously and supporting multiple users in real time. Collection for post-processing and forensic analysis enables users to exploit information from events that were not watched in real time or that would have been missed using only an FMV sensor. This mission also enables users to characterize trends and analyze networks that cannot be seen with other ISR assets.

As ISR platforms move toward multi-sensor/multi-intelligence (multi-INT) capabilities, WAMI can provide the big-picture context for situational awareness, merging these information sources and moving from data to information to decisions. Without WAMI, today's multi-INT solutions struggle to cover the same area at the same time, resulting in missed opportunities in both real-time situational awareness and post-mission forensic analysis.

This paper describes two examples of how WAMI and other sensors in use today can increase the quantity and accuracy of the intelligence reaching the user. In the first example, an automated trip wire derived from WAMI analytics starts the tracking of a vehicle, which is then scanned with a hyperspectral imaging (HSI) sensor to detect a target signature in real time. In the second example, a signals intelligence (SIGINT) detection instantiates a new region of interest (ROI) at the detection's location. As the SIGINT geolocation improves with time, the ROI focuses on that location. At the same time, the WAMI sensor tracks all vehicles within the probable region of the signal emitter so that possible movers can be correlated with the SIGINT detection. This paper expands on these demonstrations by further describing the functionality and capability of WAMI used on a multi-INT platform. This technology enables users to perform high-level multi-INT exploitation both in real time and in post-mission forensic analysis.

2. INTRODUCTION WAMI HISTORY [1]

As an intelligence, surveillance, and reconnaissance (ISR) capability, wide-area motion imagery (WAMI) has been operational for just over 10 years. WAMI technology started with a 2002 Department of Energy program at Lawrence Livermore National Laboratory. The Sonoma Persistent Surveillance program used multiple commercial sensors and was designed specifically for nonproliferation applications. Sonoma demonstrated the value and possibilities of WAMI technology, and the capability was transitioned to the Department of Defense (DoD).

The first transition to an operational system was the Constant Hawk system, which deployed to Iraq in 2006. Constant Hawk was a Quick Reaction Capability focused on counter-improvised explosive device (IED) applications. The first deployment lacked a downlink, so analysis could only be performed post-mission, after the aircraft landed. The persistent surveillance and large-area coverage allowed analysts to play back events and follow possible suspects backward to where they came from and forward to where they went. This also allowed users to evaluate possible networks of people and places of interest. This first deployment of a persistent surveillance system proved the value of a system providing the area coverage and persistence required to combat urban insurgencies.

Since the conclusion of Constant Hawk, computer vision, tracking techniques, and computer hardware capabilities have all evolved. The Air Force Research Laboratory and Los Alamos National Laboratory developed the Angel Fire system, which was used to support the Marines in Iraq. Angel Fire (later Blue Devil) included an airborne data link and onboard processing that allowed the system to stream regions of interest (ROIs) to the ground or view them onboard. This system showed the power of real-time situational awareness over a large area: within the full field of view, multiple ROIs could be zoomed in on and scrolled across. That same year, the Air Force's Big Safari Program Office started the development of the Gorgon Stare sensor for the MQ-9 Reaper Unmanned Aircraft System (UAS). This system performed both the forensic mission and the real-time situational awareness mission from a UAS that could remain on station for more than 10 hours. The system was deployed to Afghanistan in 2011. The most recent version has the largest day/night area coverage available, with resolution better than 1 meter. Figure 1 shows an example of the large-area coverage and resolution quality of the sensors used on the Gorgon Stare system.

Figure 1: Gorgon Stare (Increment 2) MWIR COP and Associated 1k x 0.6k ROI [2].

2.1 WAMI Capabilities and Missions

The two current types of missions for WAMI are forensic analysis and real-time situational awareness. These missions continue to evolve as sensor capabilities improve and processing technology advances. Detailed analysis of events, patterns of life, and networks are all part of the forensic analysis capability that has been developed since the start of WAMI missions. The forensic capabilities provided by WAMI are unmatched by other sensor types and support activity-based intelligence. The wide-area coverage and persistence allow analysts to see events that are happening concurrently and establish interconnected patterns of life, including social interactions, destinations, and origins of travel. Collecting WAMI data over time provides analysts the opportunity to:

- Observe vehicle tracks and traffic
- Study patterns of life
- Identify nodes of activity
- Identify anomalous behavior
- Utilize patterns and trends to anticipate behavior

These forms of intelligence have become valuable to both tactical and strategic planning. One of the challenges with WAMI data is the volume of data and the amount of manpower required to properly exploit it. As technology continues to improve, more of this work is being done with automated trackers and tool sets that produce derived information from these tracks. Figure 2 shows the average speed of vehicles over Rochester, NY. This forensic tool allows the user to develop a baseline for a given time and day showing the average speed, the average number of cars, and the associated variances. Once these normalcy baselines and patterns of life are established, users can look for behavior that falls outside the norm. For example, if a car in a given location is traveling at more than three times the variance above the average speed, there may be an issue (a simple rule illustrated in the sketch at the end of this section). If a vehicle stops on a highway where vehicles do not commonly stop, the occupants are either in trouble or planning to do something bad.

Figure 2: Average speed of cars over Rochester, NY.

The other baseline capability of WAMI is the improved real-time situational awareness that comes with multiple ROIs. Downlinking the entire dataset is not common because of the amount of data being collected and the limitations of the downlink. WAMI systems commonly have the ability to downlink between 2 and 40 ROIs; the number of windows is mainly limited by the downlink bandwidth. Each ROI can take between 0.5 and 1.5 megabits/second depending on the size of the ROI (SD, 720p, or 1080p) and the compression ratio selected. Each user's ROI can be controlled (zoom/scroll) within the circle of persistence (COP), so users feel as if they have a UAS taking pictures just for them. These ROIs are streamed from the aircraft/UAS to the ground station in standard formats that can be viewed with any FMV player. Figure 3 shows an example of 10 ROIs (around the edge) streamed from the full field of view. These ROIs can be in different locations, at different ground sample distances (GSDs), and of different sizes, and operators can move them to the desired locations.

Figure 3: Example of streaming multiple ROIs from a WAMI system.
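To make the normalcy-baseline idea concrete, the following is a minimal sketch (in Python, not the Harris tool set) of how per-location speed statistics could be accumulated from track data and then used to flag movers under the three-times-the-variance rule quoted above. The class name, the grid-cell keying, and the minimum-history threshold are illustrative assumptions.

```python
from collections import defaultdict

class SpeedBaseline:
    """Accumulates per-cell speed statistics from WAMI vehicle tracks and
    flags speeds that exceed the local baseline (illustrative sketch)."""

    def __init__(self, cell_size_m=100.0):
        self.cell_size_m = cell_size_m
        # cell -> [count, sum of speeds, sum of squared speeds]
        self.stats = defaultdict(lambda: [0, 0.0, 0.0])

    def _cell(self, x_m, y_m):
        return (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))

    def add_observation(self, x_m, y_m, speed_mps):
        c = self.stats[self._cell(x_m, y_m)]
        c[0] += 1
        c[1] += speed_mps
        c[2] += speed_mps ** 2

    def is_anomalous(self, x_m, y_m, speed_mps, k=3.0):
        """Flag a speed more than k times the variance above the cell mean,
        applying the text's rule literally (variance, not standard deviation)."""
        n, s, s2 = self.stats[self._cell(x_m, y_m)]
        if n < 10:          # assumed: not enough history to call anything anomalous
            return False
        mean = s / n
        var = max(s2 / n - mean ** 2, 0.0)
        return speed_mps > mean + k * var

# Example: build a baseline from historical tracks, then test new movers.
baseline = SpeedBaseline()
for speed in [12.0, 13.5, 11.8, 12.6, 13.1, 12.9, 11.5, 12.2, 13.0, 12.4]:
    baseline.add_observation(250.0, 480.0, speed)
print(baseline.is_anomalous(250.0, 480.0, 13.0))   # within baseline -> False
print(baseline.is_anomalous(250.0, 480.0, 35.0))   # well above baseline -> True
```

In an operational tool the baseline would be keyed by time of day and day of week as well as location, so that, for example, rush-hour and overnight traffic are compared against different norms.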
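The downlink sizing discussed above reduces to simple arithmetic. The small helper below is illustrative only; the protocol-overhead reserve is an assumed value, not a figure from the paper.

```python
def max_streamable_rois(downlink_mbps, roi_rate_mbps=1.0, overhead_fraction=0.15):
    """Estimate how many ROI video windows a given downlink can carry.

    roi_rate_mbps: per-ROI rate; the text quotes roughly 0.5 to 1.5 Mbit/s
    depending on ROI size (SD, 720p, 1080p) and compression ratio.
    overhead_fraction: reserve for metadata and protocol overhead (assumed).
    """
    usable_mbps = downlink_mbps * (1.0 - overhead_fraction)
    return int(usable_mbps // roi_rate_mbps)

# A 10 Mbit/s link with 1080p ROIs at ~1.5 Mbit/s each supports about 5 windows,
# while a 24 Mbit/s link with SD ROIs at ~0.5 Mbit/s approaches the 40-ROI range.
print(max_streamable_rois(10.0, roi_rate_mbps=1.5))   # -> 5
print(max_streamable_rois(24.0, roi_rate_mbps=0.5))   # -> 40
```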

2.2 WAMI Advanced Capabilities (Tracking)

The most common addition to recent WAMI systems is the ability to perform real-time tracking onboard. Most recently, Harris has incorporated the BAE Track Analytic Software Suite (TASS) into our CorvusEye sensor for real-time tracking capability. Real-time tracking enables more advanced features, like virtual watch boxes and trip wires. The user can have an ROI that follows a vehicle without requiring a human to steer it interactively. Onboard tracking also enables the more advanced concept of detecting activity and events that are outside of normalcy. Track data are analyzed to detect events: the user can define a watch box (geolocation) where a track analytic module will process the data to evaluate whether an event has occurred.

At the simple end of the analytic software is a trip wire. The trip wire can be limited by the direction, speed, or size of the object being tracked. Virtual fences can be placed to evaluate whether someone has entered or exited, or whether there is any activity within the fence. Again, all of these events can be limited by the direction, speed, and size of the object being tracked. More advanced track analytics include the detection of specific track types (right turn, left turn, stop, start, U-turn), abnormal speed detection (too slow, too fast), checkpoint avoidance, meet check (two cars meeting in a given location), and convoy detection (multiple cars following each other). A minimal trip-wire check is sketched below.

Figure 4: Example of vehicles being tracked and displayed in an ROI.
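As a rough illustration of the trip-wire analytic described above (not the TASS implementation), the sketch below tests consecutive track positions against a virtual wire segment and applies optional speed and direction filters; object-size filtering is omitted, and all names and coordinate conventions are assumptions.

```python
import math

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 crosses segment q1-q2 (2-D ground coordinates)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def tripwire_events(track, wire, min_speed_mps=0.0, heading_window=None):
    """Scan a track (list of (t, x, y)) for crossings of a trip-wire segment.

    heading_window: optional (min_deg, max_deg) filter on direction of travel,
    mirroring the direction/speed/size limits described in the text.
    """
    events = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        if not segments_intersect((x0, y0), (x1, y1), *wire):
            continue
        dt = max(t1 - t0, 1e-6)
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        # compass-style heading, clockwise from north (x = east, y = north)
        heading = math.degrees(math.atan2(x1 - x0, y1 - y0)) % 360.0
        if speed < min_speed_mps:
            continue
        if heading_window and not (heading_window[0] <= heading <= heading_window[1]):
            continue
        events.append({"time": t1, "speed_mps": speed, "heading_deg": heading})
    return events

# Example: a vehicle track crossing a wire laid across a road.
wire = ((0.0, 10.0), (20.0, 10.0))
track = [(0.0, 5.0, 0.0), (1.0, 6.0, 6.0), (2.0, 7.0, 12.0), (3.0, 8.0, 18.0)]
print(tripwire_events(track, wire, min_speed_mps=2.0))
```

Watch boxes, meet checks, and convoy detection follow the same pattern: geometric predicates over track histories with filters on speed, direction, and dwell time.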

3. USING WAMI AS THE SITUATIONAL AWARENESS FOR MULTI-INT SOLUTIONS

Today, many WAMI systems are combined on platforms with other sensors, most commonly with a full-motion video (FMV) system. This allows users not only to get the big picture with the WAMI system, but also to investigate an area of interest at higher resolution and higher frame rate. Since the sensors are independent, the user never loses the over-watch situational awareness of the WAMI while investigating an event at high resolution. Harris has demonstrated two more advanced concepts using WAMI as the situational awareness layer for a multi-INT solution. The first uses cross-cueing of a hyperspectral imaging (HSI) sensor to perform material detection on subjects of interest. The second correlates WAMI data and the associated tracks with a SIGINT collection system.

3.1 WAMI Cross-Cue of HSI Sensor

In this application, the WAMI system is used to detect an event that is then further investigated by the HSI system. For the purposes of this paper, we use the mission of event protection as an example. The WAMI sensor is deployed over the city of Rochester, NY, hours before a large sporting event. Figure 5 shows the overview of the WAMI system over the city, with the baseball stadium near the middle of the circle of persistence. The blue boxes show where different watch boxes are looking for different activities.

Figure 5: WAMI overview with watch boxes that are looking for different activities.

An activity is detected on top of a parking structure that is supposed to be off-limits before the event at the stadium. Figure 6 shows the detection of movement on top of the parking structure. The HSI system is then cross-cued to detect possible materials of interest.

Figure 6: ROI over the stadium and the detection of movement in a restricted area.

The vehicle is scanned with a longwave HSI system, which detects the release of a gas from the vehicle. Figure 7 depicts the detection of the gas as a function of time, showing its dispersion. For this demonstration, the gas was difluoroethane (dust-off aerosol). If more dangerous materials were present, such as those associated with improvised explosives or chemical weapons, action could be taken quickly because of the multi-INT capability.

Figure 7: The detection of gas being released from the car window in a restricted area.
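As a rough illustration of the cross-cue handoff used in this scenario (not Harris's actual interface; the message fields, coordinates, and sensor names are assumptions), the sketch below turns a watch-box activity event from the WAMI tracker into a pointing cue for a co-mounted HSI sensor.

```python
from dataclasses import dataclass, field
import time

@dataclass
class ActivityEvent:
    """A watch-box detection produced by the WAMI track analytics (assumed fields)."""
    lat: float
    lon: float
    track_id: int
    event_type: str                      # e.g. "entered_restricted_area"
    timestamp: float = field(default_factory=time.time)

@dataclass
class SensorCue:
    """A cue handed to another sensor on the multi-INT platform (assumed fields)."""
    sensor: str                          # e.g. "HSI", "FMV", "SAR"
    lat: float
    lon: float
    dwell_s: float
    reason: str

def cross_cue(event: ActivityEvent, sensor: str = "HSI", dwell_s: float = 10.0) -> SensorCue:
    """Build a cue for the named sensor from a WAMI activity event.

    Because WAMI detections are geolocated, the cue only needs the event's
    coordinates plus a requested dwell time; a sensor manager (not shown)
    would handle slewing and scheduling.
    """
    return SensorCue(sensor=sensor, lat=event.lat, lon=event.lon, dwell_s=dwell_s,
                     reason=f"WAMI track {event.track_id}: {event.event_type}")

# Example: movement detected in the restricted watch box cues the HSI sensor.
evt = ActivityEvent(lat=43.157, lon=-77.609, track_id=4021,
                    event_type="entered_restricted_area")
print(cross_cue(evt))
```

The same handoff structure applies to the other cue targets mentioned below (FMV, multispectral, SAR, lidar, or ground sensors); only the receiving sensor's tasking logic changes.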

The cross-cue capability can be used to cue multiple types of sensors. The most common today is the FMV sensor, but emerging uses include multispectral sensing, HSI, synthetic aperture radar, and LIDAR. Also, since the WAMI data are geolocated, cross-cueing to and from ground sensors can easily be achieved.

3.2 Correlation of WAMI Data and SIGINT Detections

SIGINT sensors are very good at detecting and identifying specific targets, but are not good at geolocating those targets. The first detection shows that the target's location is, with 95% confidence, within a large confidence ellipse. As the emitter is detected multiple times from different locations, the estimate improves and the ellipse decreases in size. Figure 8 shows this ellipse (in blue, with the most likely target location identified) at two different times, where the uncertainty goes from hundreds of meters to approximately 10 meters. Once the ellipse shrinks to a reasonable size, a new ROI can be instantiated to cover it so that a user has a view of the emitter's location. As the location confidence continues to improve (the error ellipse reduces in size), the ROI can automatically zoom in on that location. This allows the user to quickly identify possible locations of the SIGINT activity.

Figure 8: SIGINT confidence ellipses with the first detection (left) and after multiple detections have improved the confidence in the location of the target (right).

If the SIGINT target is coming from a moving vehicle, the problem becomes more difficult. To correlate the moving target data with the returns from the SIGINT sensor, we use a BAE tool called Hydra. Hydra utilizes a Bayesian framework to perform probabilistic correlation and fusion of disparate multi-sensor moving intelligence (MOVINT) sources, with the goal of providing a more accurate and complete estimate of target kinematics (position, velocity, and heading) than any one data source alone. Targets are tracked more consistently, more accurately, and for longer by combining the observations of multiple sensors into a single fused track.
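Hydra's Bayesian correlation is proprietary, but the basic gating step it implies, deciding which WAMI movers remain plausible sources of the emission, can be illustrated with a chi-square test on the Mahalanobis distance between each track position and the SIGINT error ellipse. The sketch below is an assumption-laden simplification (local planar coordinates, a single static emitter estimate), not the Hydra algorithm.

```python
import numpy as np

# 95% gate for 2 degrees of freedom (chi-square critical value).
CHI2_95_2D = 5.991

def plausible_movers(track_positions, emitter_mean, emitter_cov, gate=CHI2_95_2D):
    """Return the IDs of tracks whose position is consistent with the
    SIGINT confidence ellipse.

    track_positions: dict of track_id -> (x, y) in local metres
    emitter_mean:    estimated emitter location (x, y)
    emitter_cov:     2x2 covariance of the emitter location estimate
    """
    inv_cov = np.linalg.inv(np.asarray(emitter_cov, dtype=float))
    mean = np.asarray(emitter_mean, dtype=float)
    keep = []
    for track_id, pos in track_positions.items():
        d = np.asarray(pos, dtype=float) - mean
        m2 = float(d @ inv_cov @ d)        # squared Mahalanobis distance
        if m2 <= gate:
            keep.append(track_id)
    return keep

# Example: a large initial ellipse keeps many movers; a refined, smaller
# ellipse from later detections eliminates all but one candidate.
tracks = {101: (120.0, 40.0), 102: (-300.0, 250.0), 103: (12.0, -8.0)}
coarse_cov = [[200.0**2, 0.0], [0.0, 150.0**2]]   # hundreds of metres
fine_cov   = [[10.0**2, 0.0], [0.0, 10.0**2]]     # ~10 m after refinement
print(plausible_movers(tracks, (0.0, 0.0), coarse_cov))   # -> [101, 102, 103]
print(plausible_movers(tracks, (0.0, 0.0), fine_cov))     # -> [103]
```

Repeating this test as the ellipse shrinks over time reproduces the behavior shown in Figure 9, where movers are progressively ruled out until only the likely emitter remains; a full correlator would also propagate each track's own position uncertainty and kinematics.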

Figure 9 shows an example of multiple tracks being eliminated as possible targets through the correlation of the SIGINT information and the track geolocations at different times.

Figure 9: The movers that have stayed within the SIGINT confidence ellipses over time are colored red, while the vehicles that no longer have a possibility of being the target are turned blue.

The ability of WAMI to watch and track a large area significantly improves the situational awareness of a SIGINT collection and provides context. Also, the WAMI system is collecting high-resolution data of the target area even before the first SIGINT detection. As the detection improves, the WAMI data at the refined geolocation can be viewed for that time, and the user can also go back in time for a forensic analysis of the possible target. The correlation of the SIGINT detections with moving vehicles enables the user to refine the emitter location from a confidence ellipse to, potentially, a specific vehicle.

4. CONCLUSION

WAMI sensors continue to evolve and increase their performance and capabilities for supporting situational awareness and forensic analysis. The addition of automated tracking has enabled a new set of track analytics and activity-based intelligence. The large-area coverage and persistence make WAMI the ideal situational awareness base for different multi-INT collection systems, supporting everything from FMV cross-cueing to more complicated correlation of SIGINT with imagery and derived track information. Harris has been focused on developing, automating, and demonstrating these multi-INT capabilities. Adding temporal, spectral, and multi-INT dimensionality to the data enables automation of the analysis, which reduces the cost and time of exploiting extensive amounts of data as sensor capabilities grow. We have demonstrated two different multi-INT technology concepts in this paper. In the future, we expect to evaluate how commercial data (e.g., social media, traffic cameras, 911 alerts) can be used to improve situational awareness.

5. REFERENCES

[1] Colucci, Frank, "Persistence On Patrol," Avionics Magazine, 1 May 2013.

[2] MacEachin, J., Janosky, J., "Optical Design Considerations for Wide Area Imaging Systems," Imaging and Applied Optics 2016 (Optical Society of America, 2016), paper ITh2F.2.