Prospects for Dynamic ISR Tasking and Interpretation Based on Standing Orders to Sensor Networks


Aleksandar Pantaleev, John R. Josephson
Laboratory for Artificial Intelligence Research, Computer Science & Engineering Department, The Ohio State University
{pantalee jj}@cse.ohio-state.edu

ABSTRACT

This research is intended to contribute to the development of automated and human-in-the-loop systems for higher-level fusion that respond to the information requirements of command decision making. In tactical situations with short time constraints, the analysis of information requirements may take place in advance for certain classes of problems, and the results may be provided to commanders and their staffs as part of the control and communications systems that come with sensor networks. In particular, certain standing orders may be able to assume the role of Priority Intelligence Requirements. Standing orders to a sensor network are analogous to standing orders to Soldiers: trained Soldiers presumably do not need to be told to report contact with hostiles, for example, or to report any sighting of civilians with weapons. Such standing orders define design goals and engineering requirements for sensor networks and their control and inference systems. Because such standing orders can be defined in advance for a class of situations, they minimize the need for situation-specific human analysis. Thus, standing orders should be able to drive automatic control of some network functions, automated fusion of sensor reports, and automated dissemination of fused information. We define example standing orders, and outline an algorithm for responding to one of them based on our experience in the field of multisensor fusion.

Keywords: standing orders, PIRs, automated fusion, sensor networks, multisensor fusion

1. INTRODUCTION

The theme of this report is how the analysis of information requirements can be used to enable automated fusion for behavior recognition, threat warning, and other aspects of higher-level fusion commonly associated with JDL Levels 2 and 3. In particular, we consider the Commander's Priority Intelligence Requirements (PIRs). PIRs are advantageous to study because they are well documented, and because they are, after all, priority intelligence requirements. According to recent doctrine (e.g., US Army Field Manual FM 2-0), PIRs are designated by the commander, and specify the information elements, about the enemy or environment, that are required by the commander as an anticipated and stated priority in the task of planning and decision making. A PIR is associated with the need to make a specific decision that will affect mission accomplishment. Intelligence, Surveillance, and Reconnaissance (ISR) personnel analyze the PIRs to determine Indicators, which are potentially observable states or activities that would provide evidence for possible answers to PIRs. Indicators may be broken down hierarchically into other Indicators. These Indicators are then analyzed, typically with significant input from terrain analysis, to determine Specific Information Requirements (SIRs). Then a collection plan is generated, tailoring the gathering of information to specific units, personnel, and ISR systems. This gives rise to Specific Orders & Requests (SORs), which are then disseminated. This breakdown is illustrated in Figure 1, which also shows information flowing upward through the hierarchy as information captured during the ISR analysis process is used to fuse information automatically at run time.
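The PIR → Indicator → SIR → SOR breakdown described above is a tree, and one way the intermediate products of ISR analysis might be recorded so they remain available for fusion at run time is as linked records. The following is a minimal illustrative sketch; all class and field names are our own assumptions, not a doctrinal schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SOR:
    """Specific Order or Request, disseminated to a collector."""
    tasking: str

@dataclass
class SIR:
    """Specific Information Requirement derived from an Indicator."""
    description: str
    orders: List[SOR] = field(default_factory=list)

@dataclass
class Indicator:
    """Potentially observable state or activity bearing on a PIR.
    Indicators may be broken down hierarchically into other Indicators."""
    description: str
    sub_indicators: List["Indicator"] = field(default_factory=list)
    sirs: List[SIR] = field(default_factory=list)

@dataclass
class PIR:
    """Commander's Priority Intelligence Requirement."""
    question: str
    decision_supported: str
    indicators: List[Indicator] = field(default_factory=list)

def all_sors(pir: PIR) -> List[SOR]:
    """Walk the hierarchy top-down and collect every disseminated
    order, mirroring the downward flow in Figure 1."""
    out: List[SOR] = []
    def visit(ind: Indicator) -> None:
        for sir in ind.sirs:
            out.extend(sir.orders)
        for sub in ind.sub_indicators:
            visit(sub)
    for ind in pir.indicators:
        visit(ind)
    return out
```

Because each SOR stays linked to the SIR, Indicator, and PIR that produced it, a report answering that SOR can be traced back upward to the concern it bears on, which is exactly the upward flow Figure 1 depicts.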

Figure 1: Analysis of PIRs, ISR Tasking, and Fusion.

Note that none of this analysis or planning is currently automated, except for very narrow point solutions to very narrow special cases. As far as we are aware, fully automated ISR requirements analysis and collection planning is not thought to be possible, or even desirable, in the foreseeable future. However, computing and communications can be used to record the intermediate and final products of the ISR requirements analysis and collection planning processes, and to disseminate the results by communicating orders and requests. They can help keep track of status, and help with optimizing the collection plan. Collection planning typically occurs in the face of limited ISR resources and a great deal that a commander would like to know, with tradeoffs between the potential importance of items of information and several dimensions of the projected costs of acquiring them, including risks to personnel and equipment, and the risk of revealing to the enemy what is being looked for. In related work, researchers at our laboratory at Ohio State have been investigating interactive decision support for planning, and have recently begun work on ISR asset allocation as a specimen domain for investigating modes of computer assistance for multicriterial planning.

This report has two main points to make. The first is:

Hypothesis 1 - Knowledge of the significance of desired information, elicited during the analysis and planning processes, can be captured in the computer, and used to automate important elements of higher-level fusion.
The vision is that, among the large amounts of fused information (JDL Level 1) incoming from sensors and human reports, those items can be extracted from the stream that are relevant to predefined higher-level concerns, and their evidential impact on those concerns can be assessed automatically, so that the situation awareness of commanders and ISR personnel is enhanced by presentations of incoming information that has been automatically abstracted and interpreted to address the status of those higher-level concerns. Information captured from analysis and planning will enable automated fusion to determine, for certain incoming messages, why we wanted to know, and what the evidential significance is for answering the questions posed by the PIRs.

2. HIGHER-LEVEL FUSION FOR TACTICAL SENSOR NETWORKS

At lower echelons, especially in tactical situations with short time constraints, only an abbreviated process of ISR analysis and collection planning will take place, and it will probably not be done using a computer. How then can information be captured from analysis and planning and used to define relevance for automated fusion? One possibility is for the analysis of information requirements to take place in advance for certain classes of problems, with the results provided to commanders and their staffs as part of the control and communications systems that come with sensor networks. In particular, it may be possible for certain standing orders to assume the role of the PIRs in Figure 1. This leads to the second main point of this report:

Hypothesis 2 - Standing orders to sensor networks can specify information requirements in advance of deployment that can be used to define design goals and engineering requirements for sensor networks and for their control and inference systems. Significant elements of information fusion for support of tactical decision making can be achieved by defining a basic set of such standing orders, and by designing sensor networks to satisfy them.

Standing orders to a sensor network are analogous to standing orders to soldiers. Trained soldiers presumably do not need to be told to report contact with hostiles, for example, or to report sightings of civilians with weapons. Since such standing orders can be defined in advance for a class of situations, they minimize the need for situation-specific human analysis. Thus, standing orders should be able to drive automatic control of some network functions, automated fusion of sensor reports, and automated dissemination of fused information.
For example, sensor systems should normally:

(1) Report small-arms fire and other explosions (unless they are clearly produced by friendlies). The location of any explosion should be estimated as accurately as possible, as should the trajectories of any fires. If any friendly unit is near to, or approximately pointed at by, any trajectory, that is, if any friendly unit is plausibly the target of such fires, that unit should be notified immediately, and provided with the estimated location of the source. For convenience, we will refer to this proposed standing order as bang detection.

(2) Report humans or vehicles that are not known to be friendly approaching any friendly unit. (Some default threshold range would be set, depending on the terrain and echelon.) The unit being approached should be a recipient of such a report. Many types of sensors might be used.

(3) Report activity of humans or vehicles adjacent to any building that friendly units occupy or are adjacent to. For example, as a patrol approaches the front of a building, any people entering or leaving the back of the building should be reported to the patrol unit, as should any people coming around the building toward the patrol.

(4) Report the presence or release of hazardous nuclear, chemical, or biological materials. Such materials should be classified as specifically as possible. Probable dispersion should be projected based on the characteristics of the material, the direction and speed of the wind, and other meteorological conditions. Warnings should be issued to any friendly units in or near the path of such dispersion, including information about the materials detected, probable time to exposure, and appropriate protective measures or other responses (e.g., put on a hazmat suit, take an antidote, etc.).

These are intended as a tentative and representative set of such standing orders, not an exhaustive one. Suggestions for additions or improvements are welcome.
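Each standing order above is, in effect, a trigger rule over fused events: a predicate that decides whether an event matters, plus a dissemination action. A minimal sketch of how such orders might drive automated dissemination follows; the event fields, thresholds, and action strings are our own illustration, not a fielded schema:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class FusedEvent:
    kind: str                     # e.g. "explosion", "approach", "hazmat"
    location: Tuple[float, float] # estimated position of the event
    attributed_friendly: bool = False          # clearly produced by friendlies?
    range_to_nearest_friendly_m: float = 1e9   # distance to closest friendly unit

# A standing order: a name, a predicate over fused events, and the
# dissemination action to take when the predicate fires.
StandingOrder = Tuple[str, Callable[[FusedEvent], bool], str]

APPROACH_THRESHOLD_M = 500.0  # illustrative default; terrain/echelon dependent

STANDING_ORDERS: List[StandingOrder] = [
    ("bang_detection",
     lambda e: e.kind == "explosion" and not e.attributed_friendly,
     "notify plausibly targeted unit with estimated source location"),
    ("approach_warning",
     lambda e: e.kind == "approach"
               and e.range_to_nearest_friendly_m < APPROACH_THRESHOLD_M,
     "notify the unit being approached"),
    ("hazmat_warning",
     lambda e: e.kind == "hazmat",
     "warn units in projected dispersion path"),
]

def actions_for(event: FusedEvent) -> List[str]:
    """Return the dissemination actions triggered by one fused event."""
    return [action for _name, pred, action in STANDING_ORDERS if pred(event)]
```

The point of the sketch is only that the orders are fixed in advance and evaluated mechanically against the fused stream, with no situation-specific human analysis in the loop.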
In the case of standing order (1), bang detection, we assume that ISR systems will be able to track friendlies, because friendlies will have GPS-enabled communication units reporting their positions to a central location. The main sensors needed for this job would be acoustic, although infrared imaging might also make a contribution. Presumably infantry soldiers will have light microphone arrays as part of their battle gear, military vehicles will have acoustic sensors, and microphone sensor arrays will also be positioned at fixed locations throughout the area of interest. All microphone arrays, together with a central fusion node, form a semi-mobile acoustic sensor network. To ameliorate communication and bandwidth issues, we suppose that a microphone array would be largely autonomous: only a close match with a predetermined sound pattern (explosion, gunfire), with an amplitude above a certain threshold, would trigger sending information to the central location. The central fusion node, probably located at a command post, would be capable of pinpointing the locations of sources of predetermined sound patterns, as well as the trajectories of projectiles within the area. The system would determine the locations of explosions and the trajectories of projectiles, cross-index these locations with the locations of friendly units, and notify the relevant friendlies. The locations and types of explosive events, and the trajectories of fires, would also appear on displays at the command post.

To support standing order (2), reporting approaching humans or vehicles, a variety of sensor types might be used, with their results fused to meet the requirements. Appropriate sensors would include acoustic sensors similar to those described previously for standing order (1). Other relevant sensor types include tripwires, infrared detectors, imaging sensors, magnetometers, seismometers, and airborne GMTI. All sensors in a particular area of responsibility form a sensor network. A central fusion node would receive messages from all of the sensors, and would track vehicles as one of its responsibilities.

To support standing order (3), reporting activity around adjacent buildings, sensors would need to be capable of detecting and tracking individual humans. While this is difficult for current technology, progress appears to be occurring, and high sensor density can, in principle, provide redundancy that can be exploited to overcome the limitations of particular sensor types and detection algorithms.

To support standing order (4), providing warning of NBC threats, many new types of sensors will need to be developed, especially as new types of biological threats emerge.
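The local autonomy just described — transmitting only when a sound closely matches a predetermined pattern and exceeds an amplitude threshold — can be sketched as a simple two-gate test at the sensor unit. The thresholds, the template dictionary, and the use of normalized correlation as the matcher are our own illustrative assumptions:

```python
import math
from typing import Dict, List, Optional

AMPLITUDE_THRESHOLD = 0.2  # illustrative; would be set per deployment
MATCH_THRESHOLD = 0.9      # minimum normalized correlation to report

def normalized_correlation(a: List[float], b: List[float]) -> float:
    """Cosine similarity of two equal-length waveform snippets."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def should_report(sound: List[float],
                  patterns: Dict[str, List[float]]) -> Optional[str]:
    """Return the code of the best-matching pre-stored pattern if the
    sound clears both gates; return None to suppress transmission."""
    amplitude = max(abs(x) for x in sound)
    if amplitude < AMPLITUDE_THRESHOLD:
        return None
    best_code, best_score = None, 0.0
    for code, template in patterns.items():
        score = normalized_correlation(sound, template)
        if score > best_score:
            best_code, best_score = code, score
    return best_code if best_score >= MATCH_THRESHOLD else None
```

Because most ambient sounds fail one of the two gates, the unit stays silent most of the time, which is precisely what keeps communication and bandwidth costs low.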
Nevertheless, sensors for nuclear radiation are quite mature, and sensors for chemical and biological threats are the subject of significant current efforts.

Support for all of the standing orders described here will require that information about the local terrain be supplied to the fusion algorithms, so that this information is available to constrain hypotheses about the movements of entities and about the propagation of signals from entities and events to the sensors used for detection and interpretation. To illustrate how standing orders can lead to information requirements, and how these information requirements lead to designs, we now consider bang detection in greater detail.

3. BANG DETECTION

3.1. Design sketch

For the purposes of bang detection (1), the sensor network might plausibly consist of self-contained units and a central node. Every unit might be equipped with a GPS receiver, an acoustic array, a wireless transceiver, and a digital clock (set from the GPS signal) capable of measuring time with microsecond precision. Every unit would have a unique wireless address, which distinguishes it from other units for routing purposes. A unit would be capable of performing limited processing, constrained mainly by its power source. Units of this kind would be carried by all friendlies and mounted on friendly vehicles, and it is conceivable that the area of interest would have been prepared by positioning stationary units at predefined locations, or by scattering them about. The central fusion node would plausibly be located at the command post, and be responsible for a particular area of interest. The central node would make use of a model of the terrain, including any urban structures or features. Whenever an acoustic array unit detects a sound with an amplitude above a given threshold, it would compare the sound against a list of predetermined sound patterns for explosions and small-arms fire.
The list would include sound patterns for muzzle explosions as well as for the cracks arriving from supersonic projectiles. If the received sound matches sufficiently well with a pre-stored pattern, the unit would send a message containing a short report of its encounter to the central node. The message would contain the following information: the code of the pattern encountered, its average amplitude, the direction of arrival along with a margin of error, the time of arrival, and the location of the unit. Preparing the sensor network would require assigning unique codes to the important sound patterns that might be encountered, loading all sensor units with the codes and patterns, and providing the central node with the codes.

3.2. Establishing array orientation

A possible issue with providing the exact angle of incidence of a sound is the inherent difficulty of establishing a mobile acoustic array's orientation at a given point in time. That is, an acoustic array readily provides the directions of sounds, but those directions are measured against the microphone array's own orientation axis, and it is not trivial to estimate the direction of that axis. In this discussion we offer two possible solutions to this issue, but we do not elaborate on them. Rather, we assume that this problem will be solved, and that the message a sensor unit sends would indeed contain the direction of a sound with reasonable accuracy (a possible error of up to, say, eight degrees).

One way to solve the orientation problem might be to have more than one GPS receiver on every sensor unit. If differential GPS is used with the help of a nearby fixed-location unit, then positional information with centimeter precision could be obtained. By locating several GPS receivers at known physical points within a sensor unit, the unit could triangulate the direction of the orientation axis of its microphone array from the information its GPS receivers provide. The accuracy of this method depends mainly on the size of a sensor unit, as the GPS receivers have to be positioned some distance from each other.
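Under the multi-receiver scheme, once differential GPS yields the positions of two receivers at known points on the unit (say, one at the rear and one at the front of the array axis), the orientation axis is simply the forward azimuth between them. The sketch below uses the standard forward-azimuth formula; the function name and the rear/front convention are our own assumptions:

```python
import math
from typing import Tuple

def bearing_deg(rear: Tuple[float, float],
                front: Tuple[float, float]) -> float:
    """Bearing, in degrees clockwise from true north, from the rear GPS
    receiver to the front one, taken as the array's orientation axis.
    Positions are (latitude, longitude) in degrees. This is the standard
    great-circle forward-azimuth formula; for a baseline of centimeters
    to meters it is effectively exact."""
    lat1, lon1 = map(math.radians, rear)
    lat2, lon2 = map(math.radians, front)
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0
```

A sound's absolute direction of arrival would then be this axis bearing plus the array-relative angle the acoustic processing reports, with the errors of the two estimates combining.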
It is certainly feasible to have sensor units of this kind on vehicles, with a possible triangulation error of up to eight degrees; precise calculations need to be made, however, to estimate the feasibility of including a similar sensor unit as part of a soldier's battle gear.

Another way to solve the orientation problem would be to use more than one sensor unit for triangulation. A sensor unit might be capable of emitting sound waves from its microphones, and another sensor unit would be capable of perceiving those sound waves and relating their angle of incidence to its own orientation axis. Triangulation would depend on the receiving unit knowing the GPS location of the transmitting one; for that purpose, the sensor unit that emits a sound signal might simultaneously transmit a wireless message containing its GPS location. This method does not have the physical size constraints of the previous one, as the sensor units participating in the exchange can be some distance from each other. The feasibility of this method depends on the sensor units being able to reliably perceive the sounds emitted by other sensor units, which might be a problem in a battlefield situation, even though the sounds need not be in the human hearing range. This method would require a good protocol, developed specifically for the purpose, for communicating requests for sound emissions and for processing the responses in light of the GPS locations of the emissions. It would also need a good algorithm for matching received sounds with the received wireless messages containing the originating locations of those sounds.

3.3. Fusion algorithm

Once such a bang report reaches the central node, it would be unpacked and the information it carries input to the analysis of the developing battle scene. The central node would need to be capable of fusing that information to infer the plausible locations of explosions and gunfire, as well as the trajectories of projectiles.
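The bang report that travels from a sensor unit to the central node carries exactly the fields listed in the design sketch. A minimal serialization sketch follows; the field names and the JSON wire encoding are our own assumptions, not a specified message format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BangReport:
    pattern_code: str           # code of the matched sound pattern
    amplitude: float            # average amplitude of the received sound
    direction_deg: float        # direction of arrival
    direction_error_deg: float  # margin of error on the direction
    time_us: int                # time of arrival, microsecond precision
    unit_lat: float             # reporting unit's GPS location
    unit_lon: float

def pack(report: BangReport) -> str:
    """Serialize a report for transmission to the central node."""
    return json.dumps(asdict(report))

def unpack(message: str) -> BangReport:
    """Unpack a received message at the central node for input
    to the fusion process."""
    return BangReport(**json.loads(message))
```

A report of this size is a few dozen bytes, consistent with the earlier point that local gating keeps the network's bandwidth demands modest.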
The required fusion algorithm might work as follows:

a. Estimate a plausible general location for the sound source based on the information contained within a single report. The necessary information for this estimation includes the location of the sensor node, the direction of the sound, and its amplitude. It can be assumed that the amplitude of a received sound would be enough to provide some measure of the distance the sound traveled, if the sound pattern is well known (presumably all AK-47 assault rifles produce sounds of similar pattern and amplitude, for example). This distance measure could be expected to have low accuracy, and might generally be constrained to discriminate only among close, mid-range, and far away.
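For a well-known pattern whose source amplitude is roughly fixed, the coarse distance estimate in step (a) can fall out of simple spherical-spreading (1/r) attenuation. The reference amplitude and bin edges below are purely illustrative assumptions:

```python
# Illustrative: received amplitude of a known pattern at 1 m from its
# source, in the same (arbitrary) units as the reported amplitude.
REFERENCE_AMPLITUDE_AT_1M = 1000.0

def estimated_range_m(received_amplitude: float) -> float:
    """Invert 1/r spherical-spreading attenuation for a point estimate
    of the distance the sound traveled."""
    return REFERENCE_AMPLITUDE_AT_1M / received_amplitude

def range_bin(received_amplitude: float) -> str:
    """Collapse the low-accuracy point estimate into the three coarse
    bins suggested in step (a)."""
    r = estimated_range_m(received_amplitude)
    if r < 100.0:
        return "close"
    if r < 1000.0:
        return "mid-range"
    return "far away"
```

In practice atmospheric absorption, terrain, and source variability would all distort the 1/r model, which is why the text constrains the estimate to three coarse bins rather than a precise range.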

b. Estimate the time that a sound was emitted based on its estimated location, the distance the sound traveled to the sensor node, and the time it was received.

c. Decide whether a subsequently reported sound source might be the same as a previously reported one, and associate incoming messages with sound sources on the basis of the information processed from previous messages. All plausible combinations of sounds of the same kind might be considered, each one amounting to a hypothesis specifying the time and location of a kind of explosion. Each such hypothesis, and its more precise sub-hypotheses, would be assigned a measure of plausibility based on: how many bang reports from individual sensor units the hypothesis can explain; how well the hypothesis explains these reports considering distances, arrival times, and arrival angles; and how many bang reports should have been received from nearby sensor units but were not. If the plausibility of a hypothesis does not exceed some threshold, it would be rejected; otherwise it would be considered plausible.

d. Decide whether a hypothesized explosion has sufficient evidence to be accepted, i.e., considered to be true. This might work as follows. After sufficient time has passed for the sound from a hypothesized explosion to have reached several sensors: if only one plausible hypothesis is available to explain a report, that hypothesis would be automatically accepted, and all of the reports that it explains would be marked as explained. If more than one plausible hypothesis is available to explain an unexplained report, but one of these hypotheses is more plausible than the others considering its score from step c, then that hypothesis would be automatically accepted if the difference in plausibility score between the most plausible hypothesis and its nearest rival exceeds a preset threshold T.
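Steps (c) and (d) amount to scoring competing explosion hypotheses by how well they explain the bang reports, then accepting a hypothesis when it is either the sole plausible explanation of a report or beats its nearest rival by the threshold T. A minimal sketch of that acceptance rule; the scoring weights, threshold values, and data shapes are our own illustrative assumptions:

```python
from typing import Dict, Optional, Set

PLAUSIBILITY_FLOOR = 1.0  # hypotheses scoring below this are rejected
T = 2.0                   # required margin over the nearest rival

def plausibility(explained: Set[str], fit: float, missing: int) -> float:
    """Step (c): reward the number of bang reports a hypothesis explains
    and how well it explains them (fit in [0, 1]); penalize nearby units
    that should have reported but did not."""
    return len(explained) * fit - 0.5 * missing

def accept(scores: Dict[str, float],
           explains: Dict[str, Set[str]],
           report_id: str) -> Optional[str]:
    """Step (d): return the accepted hypothesis for an unexplained
    report, or None if the evidence is ambiguous."""
    candidates = [(h, s) for h, s in scores.items()
                  if s >= PLAUSIBILITY_FLOOR and report_id in explains[h]]
    if not candidates:
        return None
    candidates.sort(key=lambda hs: hs[1], reverse=True)
    if len(candidates) == 1:
        return candidates[0][0]      # sole plausible explanation
    if candidates[0][1] - candidates[1][1] > T:
        return candidates[0][0]      # clearly best explanation
    return None                      # too close to call; wait for evidence
```

Marking the accepted hypothesis's reports as explained, and re-running the rule as further reports arrive, gives the incremental behavior the text describes.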
Decreasing the threshold T would increase the sensitivity, moving the system along the ROC curve toward more false positives and fewer false negatives. If a hypothesis is accepted, all of the reports that it explains would be marked as explained. This inference method is basically a form of inference to the best explanation, or abductive inference. In particular, it is a variant of the processing strategy given in Josephson & Josephson (1996, p. 208 ff.).

e. Sound reports matching the cracks emitted by supersonic projectiles would be handled somewhat differently. The possible projectile trajectories would be estimated from multiple reports, including the reports corresponding to the muzzle explosions. Devising an algorithm for generating plausible trajectories that does not generate an overwhelming number of such hypotheses is a challenging problem, which will not be explored in this report. Nevertheless, it should be possible to derive such an algorithm, at least for relatively benign cases, based on close measurements of arrival time, and considering that cracks from supersonic projectiles are heard only at close range.

3.4. Echo reversal using terrain information

The algorithm just described assumes straight-line, shortest-distance paths of sound propagation, which may not hold, especially in urban terrain. The algorithm can presumably be enhanced if an accurate three-dimensional model of the terrain in the area of interest is provided in advance to the central node, which would use this terrain model to improve the reliability and locational accuracy of accepted explosion and trajectory hypotheses. It would do so by modeling the possible non-straight propagation paths. Reflections might be handled by reversing the path of arrival until it intersects a reflective surface represented in the terrain model. Propagation paths might also be hypothesized that follow shortest paths around obstacles, thus accounting for sound diffraction as well as reflection.
Walls in the model would count against hypotheses for projectile trajectories that pass through those walls. If the terrain is urban, and reflections (echoes) are therefore plentiful, then a single sound emission might arrive at a single sensor unit via more than one path, and consequently the sensor unit might send more than one report for a single sound emission. By reversing the arrival paths, these multiple reports would provide multiple sources of information for estimating the location of the source, allowing for improved accuracy, albeit at the cost of a substantially increased computational load.
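Reversing a reflected arrival path across a wall in the terrain model is the classical image-source construction: the apparent source that the reversed ray points to lies behind the wall, and mirroring it across the wall plane recovers the candidate true source position. A two-dimensional sketch, with the wall plane given by a point on it and a unit normal (the function name and 2-D simplification are our own):

```python
from typing import Tuple

Vec = Tuple[float, float]

def mirror_across_wall(apparent_source: Vec,
                       wall_point: Vec,
                       wall_normal: Vec) -> Vec:
    """Reflect the apparent (image) source position across the wall
    plane to obtain the candidate true source position.
    wall_normal must be a unit vector."""
    px, py = apparent_source
    wx, wy = wall_point
    nx, ny = wall_normal
    # Signed distance from the apparent source to the wall plane.
    d = (px - wx) * nx + (py - wy) * ny
    # Move twice that distance back across the plane.
    return (px - 2.0 * d * nx, py - 2.0 * d * ny)
```

Applying this to each echo of a single emission maps all of the corresponding image sources back onto (approximately) one true location, which is what lets multiple echo reports reinforce a single source hypothesis.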

3.5. Top-down control

The central fusion node could be designed to send control information to the sensor units. This control information might include waking a sufficient number of units to maintain vigilance, while instructing the remaining units to keep sleeping to conserve battery power. Units could be rapidly awakened to improve accuracy in response to a firefight. Control might also include sending doubt signals to sensor units that submit reports using classification codes that are not corroborated by other reports. A unit receiving such a signal would reevaluate the corresponding sound pattern, and provide a different, second-best matching code for the pattern. To support this, each sensor unit would have to maintain some amount of short-term memory. Another use of top-down control might be to increase the detection sensitivity of relatively isolated units, to extend their effective ranges.

4. SUMMARY AND CONCLUSION

We have outlined how the analysis of information requirements can enable automated fusion for behavior recognition, threat warning, and other aspects of higher-level fusion needed to support military decision making. At higher echelons, where the analysis of information requirements and the tasking of ISR assets are done explicitly, and where computers and communication resources will presumably be used as a matter of course, the knowledge needed for automated fusion can be captured as a side effect of the analysis and tasking. At lower echelons, standing orders to sensor networks can be used to drive the capture, in advance of deployment, of at least portions of the needed knowledge. We have also sketched a design for sensor networks to support bang detection, a plausible example of such a standing order.
Considering the current state of technology for sensors, communications, and algorithms for pattern recognition and entity tracking, and considering the rapid progress that can be expected to result from the substantial current investment in research and development in this area for both military and civilian applications, it is quite plausible that sensor networks of the near future will be capable of supporting the types of standing orders we have described. We suggest that progress will be much more rapid if an initial set of standing orders can be agreed upon, and used to focus investment in the R&D and design of sensors and sensor networks.

ACKNOWLEDGEMENTS

This research was supported through participation in the Advanced Decision Architectures Collaborative Technology Alliance (CTA) sponsored by the U.S. Army Research Laboratory under Cooperative Agreement DAAD19-01-2-0009. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory, the Defense Department, or the U.S. Government.

REFERENCES

1. Headquarters, Department of the Army (2004). Intelligence, US Army Field Manual No. 2-0.
2. Josephson, J. R., & Josephson, S. G. (Eds.) (1994, 1996). Abductive Inference: Computation, Philosophy, Technology. New York: Cambridge University Press.
3. Josephson, J. R., B. Chandrasekaran, and Mark Carroll, "Toward a Generic Architecture for Multisource Information Fusion," Proceedings of the US Army Research Laboratories Collaborative Technology Alliances Symposium, April 29 - May 1, 2003, University of Maryland Conference Center, College Park, MD.
4. Pantaleev, A., and Josephson, J. (2006). "Higher-Level Fusion for Military Operations Based on Abductive Inference: Proof of Principle," Proceedings of the Conference on Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications, part of The International Society for Optical Engineering (SPIE) Defense and Security Symposium, Orlando, FL.