
Proc. SPIE, Enhanced and Synthetic Vision 2005, Jacques G. Verly, Ed., May 2005 (Clearance number: AFRL/WS, 23 February 2005).

Synthetic vision system for improving unmanned aerial vehicle operator situation awareness

Gloria L. Calhoun* a, Mark H. Draper a, Mike F. Abernathy b, Frank Delgado c, and Michael Patzek a

a Air Force Research Laboratory/HECI, 2210 Eighth St., Bldg. 146, Rm. 122, WPAFB, OH
b Rapid Imaging Software, Inc., 1318 Ridgecrest Place SE, Albuquerque, NM
c NASA/JSC, Mail Code ER2, Bldg. 32, Rm. 227, 2101 NASA Parkway, Houston, TX
*Gloria.calhoun@wpafb.af.mil

ABSTRACT

The Air Force Research Laboratory's Human Effectiveness Directorate (AFRL/HE) supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. Recent research, in collaboration with Rapid Imaging Software, Inc., has focused on determining the value of combining synthetic vision data with live camera video presented on a UAV control station display. Information is constructed from databases (e.g., terrain, cultural features, pre-mission plan, etc.), as well as numerous information updates via networked communication with other sources (e.g., weather, intel). This information is overlaid conformally, in real time, onto the dynamic camera video image display presented to operators. Synthetic vision overlay technology is expected to improve operator situation awareness by highlighting key spatial information elements of interest directly on the video image, such as threat locations, expected locations of targets, landmarks, emergency airfields, etc. It may also help maintain an operator's situation awareness during periods of video datalink degradation/dropout and when operating in conditions of poor visibility. Additionally, this technology may serve as an intuitive means of distributed communications between geographically separated users. This paper discusses the tailoring of synthetic overlay technology for several UAV applications. Pertinent human factors issues are detailed, as well as the usability, simulation, and flight test evaluations required to determine how best to combine synthetic visual data with live camera video presented on a ground control station display and to validate that a synthetic vision system is beneficial for UAV applications.

Keywords: synthetic vision, conformal overlay, situation awareness, unmanned aerial vehicle, UAV

1. OVERVIEW

Unmanned Aerial Vehicles (UAVs) are aircraft without the onboard presence of a pilot or crew. Though the physical separation of the crew from the aircraft offers many promising benefits, it also presents challenges to the effective design of the UAV control station. Numerous human factors issues such as system time delays, poor crew coordination, high workload, and reduced situation awareness may negatively affect mission performance 1. When onboard an aircraft, a pilot and crew receive a rich supply of multi-sensory information instantaneously regarding their surrounding environment. UAV operators, however, may be limited to a time-delayed, reduced stream of sensory feedback delivered almost exclusively through the visual channel. Of all the information displays within military UAV control stations, the video imagery from various cameras mounted on the UAV is particularly valuable. UAV pilots use imagery from the nose and gimbal cameras to verify a clear path for taxi/runway operations, scan for other air traffic in the area, and identify navigational landmarks and potential obstructions.
Additionally, sensor operators use imagery from a gimbal-mounted camera to conduct a wide variety of intelligence, surveillance, and reconnaissance activities, as well as to directly support combat operations. However, video imagery quality can be compromised by narrow camera field-of-view, datalink degradations, poor environmental conditions (e.g., dawn/dusk/night, adverse weather, variable clouds), bandwidth limitations, or a highly cluttered visual scene (e.g., in urban areas or mountainous terrain). If imagery interpretation could be enhanced and made more robust under a wide variety of situations, UAV mission effectiveness would be expected to increase substantially. Synthetic vision systems can potentially ameliorate negative video characteristics and enhance UAV operator interpretation of the imagery. Spatially-relevant information is constructed from databases (e.g., terrain, cultural features, maps, etc.) as well as numerous real-time information updates via networked communication with


other sources (e.g., intelligence assets, C2 sources, etc.) and overlaid conformally onto the dynamic camera image display. These computer-generated overlays appear to co-exist with real objects in the imagery, highlighting those points and regions of interest to operators. Those familiar with virtual reality technology will know this concept as augmented reality 2. This synthetic vision overlay is hypothesized to have many benefits. First, it may improve operator situation awareness by highlighting information elements of interest on the camera image, such as threat locations, the expected locations of targets, landmarks, emergency airfields, and the positions of friendly forces. Secondly, it may maintain the operator's situation awareness of an environment if the video datalink is temporarily degraded or lost. Synthetic vision systems can also serve to facilitate intuitive networked communications between geographically separated users. One concept is on-command representation of friendly, neutral, and hostile forces using synthetic overlays, allowing the UAV operator to look around and see those around him/her. Friendly forces, networked in some manner, could share information on their past and present positions, as well as planned paths and possibly their action points, facilitating team interaction. The friendly forces could also pool their knowledge of neutral and hostile forces to help maintain battlespace awareness. Conceptually, the synthetic vision system can display things that cannot normally be seen. For example, the state of a system might be portrayed based on its emissions (e.g., radar), or machine-to-machine communications (e.g., datalink activity) might be highlighted when data is being sent/received. By representing these activities and states, the operator may be able to gain additional situation awareness about the surrounding systems.

This paper describes an ongoing collaboration between Rapid Imaging Software, Inc. and the Air Force Research Laboratory's (AFRL) Warfighter Interface Division in tailoring and evaluating a synthetic vision system for UAV applications. Related human factors issues will be delineated and discussed. An overview will then be provided of an AFRL research program that is evaluating the benefits of synthetic vision technology for UAV applications and developing human factors guidelines associated with this technology.

2. COLLABORATION ON CANDIDATE UAV SYNTHETIC VISION SYSTEM

The Air Force Research Laboratory's Warfighter Interface Division has engaged in an undertaking to design and evaluate the military utility of conformal interactive synthetic vision overlay concepts tailored to UAV operations. Using detailed knowledge of current Air Force UAV operations along with established human factors design practices, several initial interface concepts have been generated and supporting hardware architecture developed. In parallel, Rapid Imaging Software, Inc., under a NASA research contract, has developed a synthetic vision product called the SmartCam3D System (SCS) to improve the situation awareness of NASA UAV operators. The SCS has been engineered to be tested in operational UAV environments and has been evaluated by operators during multiple flights of the NASA X-38 UAV and the Army Shadow UAV. The present collaboration brings these two resources together to design and evaluate tailored synthetic visual overlays for various Air Force UAV and C2 applications, including those involving teleoperated and small UAV applications.
2.1 SmartCam3D Synthetic Vision System

The SmartCam3D (SCS) is an enhanced visualization technology developed by Rapid Imaging Software, Inc. as part of a NASA X-38 RPV flight-test effort. Subsequently, it was matured during an integration effort for an Army UAV program. For the present effort, it has been tailored for Air Force UAV applications through collaboration with the AFRL. This system combines real-time synthetic vision with live video in an attempt to enhance the situation awareness of UAV operators across a wide range of missions and environmental conditions (Figure 1). This technology provides users with real-time video that is enriched with conformal, spatially-relevant scene information from multiple sources (database, mission plan, real-time intel updates, etc.). The goal is to effectively increase the signal-to-noise ratio of the imagery, allowing operators to more quickly locate, identify, and act on critical information. The SCS consists of three physical components: the UAV platform, a camera which produces a video image stream, and a computer (PC or laptop) with the SCS software to create a synthetic view which matches the camera (Figure 2). The SCS computer is stationed in the control station and has a geographic database that the software uses to create the synthetic view. Additional network feeds are needed to provide SCS with real-time intelligence and C2 updates. The notion of SCS is simple. A video camera is mounted on the aircraft in such a manner as to provide the

operator with a view from the vehicle. At the same time, a computer creates a three-dimensional representation of the current scene that the camera should be viewing. Doing this requires a camera bore-sight calibration procedure to co-align the real and simulated cameras. Once this is done, the two streams of video are overlaid inside the computer.

Figure 1. SmartCam3D (SCS) display illustrating spatially referenced computer-generated overlay symbology onto real-time video imagery.

Figure 2. Basic components of the SmartCam3D System (SCS): a video stream (any wavelength will suffice) with camera position and orientation data, real-time vehicle position and attitude data (latitude, longitude, altitude, pitch, heading, roll), a GIS terrain database, and HUD/miscellaneous overlays, all combined by the computer into an output stream of camera video merged with real-time synthetic vision imagery.
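To make the registration step concrete, the following is a minimal sketch (in Python) of the geometry implied by Figure 2: projecting a geo-referenced point into pixel coordinates from the reported vehicle state. All names are illustrative rather than the actual SCS interfaces, a flat-earth local-tangent approximation stands in for a proper geodetic model, and the roll and sign conventions are one plausible choice.

    # Illustrative sketch only; not the SCS API.
    import math
    from dataclasses import dataclass

    EARTH_R = 6_371_000.0  # mean Earth radius, metres

    @dataclass
    class PlatformState:
        lat_deg: float
        lon_deg: float
        alt_m: float
        pitch_deg: float
        heading_deg: float
        roll_deg: float
        gimbal_pan_deg: float = 0.0   # only needed for a gimballed camera
        gimbal_tilt_deg: float = 0.0
        focal_px: float = 1000.0      # focal length implied by the zoom setting

    def project(s: PlatformState, lat, lon, alt, width=640, height=480):
        """Pixel (u, v) of a world point, or None if behind the camera."""
        # East/North/Up offset of the point from the vehicle (small angles).
        east = math.radians(lon - s.lon_deg) * EARTH_R * math.cos(math.radians(s.lat_deg))
        north = math.radians(lat - s.lat_deg) * EARTH_R
        up = alt - s.alt_m
        # Rotate into the camera frame: heading/pan, then pitch/tilt, then roll.
        yaw = math.radians(s.heading_deg + s.gimbal_pan_deg)
        fwd = north * math.cos(yaw) + east * math.sin(yaw)
        right = east * math.cos(yaw) - north * math.sin(yaw)
        tilt = math.radians(s.pitch_deg + s.gimbal_tilt_deg)
        depth = fwd * math.cos(tilt) + up * math.sin(tilt)
        cam_up = up * math.cos(tilt) - fwd * math.sin(tilt)
        roll = math.radians(s.roll_deg)
        right, cam_up = (right * math.cos(roll) - cam_up * math.sin(roll),
                         right * math.sin(roll) + cam_up * math.cos(roll))
        if depth <= 0:
            return None  # behind the camera; do not draw
        # Pinhole projection to screen coordinates (origin at top-left).
        return (width / 2 + s.focal_px * right / depth,
                height / 2 - s.focal_px * cam_up / depth)

A point that projects to None lies behind the camera and is simply not drawn; anything else is rendered at the returned pixel position so that it appears to co-exist with the corresponding real object.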

Another critical component for SCS operation is a real-time video stream and position/attitude data for the aircraft. The specific data required includes latitude, longitude, altitude, pitch, heading, and roll. Additionally, if the camera is mounted on a gimbal and has zoom capability, then gimbal angles and zoom settings are also required in the data stream. Realistic performance requires high data rates (i.e., updates more often than once per second). This data allows the software to synthesize a real-time synthetic scene to match the camera's view. Symbology on the SCS is based on information inserted by the computer that details the location of landing zones, no-fly zones, runways, obstructions, buildings, topography, and other geographic data. Anything with known geographic coordinates can be included in the scene. Since the synthetic vision system is based on the VisualFlight software, which is already compatible with most NIMA (National Imagery & Mapping Agency) data formats (DTED, ADRG, CADRG, CIB, etc.), the necessary geographic data is readily available. This synthetic scene is overlaid on the video in real time and matches the camera view. As the vehicle flies, operators can look at the live video and see the target or landing locations overlaid with the synthetic view. In cases of night, poor weather, or other limited-visibility environments, the operators can utilize the computer-generated synthetic camera imagery (essentially re-creating modeled components of the real-world scene via computer graphics). However, if video imagery is available, it can provide a view that includes transient objects that are not present in the geographic database. Obstructions and hazardous areas can be clearly marked, as can important landmarks and desired landing points. Furthermore, because the synthetically created objects are generated from digital data, they are not subject to the limitations of visibility inherent to video. While darkness, terrain occlusion, smoke, fog, icing, and haze all impact the video, the synthetically generated scene remains unobstructed.

2.2 Example Synthetic Vision Overlay Interface Concepts for UAV Applications

Candidate interface concepts were generated for teleoperated UAV applications (examples are depicted in Figures 3-6). These concepts are a result of the AFRL/Rapid Imaging Software, Inc. collaboration effort, along with the results of a usability analysis conducted with UAV operators (see section 4.1). Validation of these concepts in high-fidelity simulation and flight tests is underway (see sections 4.2 and 4.3). Even though this validation process has not been completed, the display concepts are introduced here to make the follow-on discussion of human factors issues more relevant.

Figure 3. Synthetic vision symbology added to simulated UAV gimbal video imagery (symbology marking threat, landmarks, areas of interest, and runway).
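The overlay step itself then reduces to per-pixel compositing. A minimal sketch, assuming the synthetic renderer yields an RGBA symbology layer already registered to the current video frame (NumPy arrays; names are illustrative), including the limited-visibility fallback to a purely synthetic scene described above:

    import numpy as np

    def overlay(video, synthetic_rgba, video_ok=True):
        """Return the displayed frame: live video plus conformal symbology.

        video: H x W x 3 array; synthetic_rgba: H x W x 4 array whose
        alpha channel is zero wherever no symbology was drawn.
        """
        rgb = synthetic_rgba[..., :3].astype(float)
        if not video_ok:
            return rgb  # limited-visibility fallback: synthetic scene alone
        alpha = synthetic_rgba[..., 3:].astype(float) / 255.0
        return (1.0 - alpha) * video + alpha * rgb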

Figure 4. Synthetic vision symbology added to simulated UAV nose camera video imagery (symbology marking threats and planned vehicle pathway).

Figure 5. Synthetic vision symbology for improving situation awareness in cluttered urban environments.

Figure 6. Picture-in-picture concept, with real video imagery surrounded by synthetically generated terrain imagery. This affords virtual expansion of the available sensor field-of-view well beyond the physical limits of the camera.

3. HUMAN FACTORS ISSUES WITH UAV SYNTHETIC VISION SYSTEMS

Synthetic vision systems consist of computer-generated worlds created solely from various models and databases. Presenting computer-based information in a conformal manner with sensor imagery on a unified display has been demonstrated in past research to reduce scanning time, reduce the need to mentally integrate spatial information from disparate sources, and facilitate attentional focus and management 3. It is thought that the performance benefit is a result of the synthetic vision system highlighting important information elements. Also, the system can include information that does not have a correlate in the actual sensor imagery, such as threat lethality envelopes. Past research has primarily focused on using synthetic vision systems to support piloting tasks during low-level flights with manned platforms 4. The system can convey self-motion cues and depth cues without occluding the sensor image. Moreover, flight guidance symbology can be provided for reduced-visibility conditions, especially during terminal flight operations such as landings. It is anticipated that a synthetic vision system's highlighting would have similar benefits for UAV flight operations. Additionally, ground target search and identification tasks associated with many UAV missions will likely benefit from this technology. However, there are numerous human factors issues (described below) to consider for UAV applications of synthetic vision systems.

3.1 Location of synthetic vision symbology

One major design consideration is the location of the synthetic vision symbology: overlaid on the existing camera display, on a head-coupled head-mounted display (HMD), or on a separate dedicated display on the console. Although head-coupled applications afford the potential of full 360-degree viewing of an area, previous simulation research has demonstrated that use of an HMD can be detrimental to many UAV sensor operator tasks 5. The use of separate displays reduces clutter on the sensor imagery; however, previous research has shown that the scanning time involved in using a separate display is often more costly than the additional clutter imposed by overlaying synthetic vision onto the existing sensor imagery display 6. Having the information overlaid conformally on the camera display may reduce scan time, minimize division of attention, and improve information retrieval, but with the potential cost of additional clutter and the possibility of cognitive tunneling (explained further below). This research issue also involves determining how information on head-up displays should correspond to head-down displays. The concept is to maintain visual momentum as the individual shifts attention from head-down to head-up or vice versa.

3.2 Presentation of synthetic vision symbology: scene-linked or not

Assuming that the synthetic vision symbology is to be overlaid on the sensor image rather than presented on a separate display, it needs to be determined how the symbology will be presented simultaneously with the sensor image. Specifically, will the symbology elements simply be superimposed over the sensor image, or will elements also be scene-linked (referenced to the world) such that they undergo the same visual transformations as real objects depicted in the imagery? An example of the latter is a virtual billboard growing larger as the operator approaches the runway; any pitch or yaw of the aircraft would be perceived incidentally when viewing the display. The choice of presentation method may be symbology-element specific. One driving factor is the degree to which the UAV display is anticipated to be cluttered: if there are numerous items at short range, it may not be feasible to have them all increase in size as the operator closes in. However, scene-linking symbology to elements in the real image may reinforce other motion cues and benefit information retrieval (see 3.6).

3.3 Optimization of the synthetic vision system symbology

3.3.1 Individual synthetic vision symbology elements

For every display element, there are a multitude of related human factors issues. For instance, for each line segment and icon used, what is the ideal shape, color, brightness, contrast, size, thickness, style, etc.? What is the ideal font size, color, background, etc. for any label used? To what degree of detail should the labels provide information? Should the transparency of the symbology be manipulated such that both the video imagery and the synthetic vision symbology are simultaneously visible? Should color and size vary based upon visual conditions? For certain types of symbols, additional design questions arise. With pathway-in-the-sky formats, for example, the appropriate number of pathway segments needs to be determined, along with their spacing. Finally, usability of the candidate symbology elements needs to be evaluated. For this testing, the symbology should be presented against a sensor image that replicates the anticipated background, sensor view, clutter, light level, etc. of operational applications.

3.3.2 Terrain overlay

There are a variety of methods that can be employed to portray terrain in a synthetic vision display, including a simple gridded overlay (rectangular grids of known size to facilitate depth perception), terrain texturing (e.g., colors corresponding to different absolute terrain elevations), and photo-realistic terrain imagery (i.e., from satellite imagery data). Various maps and other geographically referenced overlays may also be useful depending on the task at hand (e.g., FalconView). Once again, usability of the candidate terrain overlay needs to be evaluated in representative UAV environments.

3.3.3 Picture-in-picture (PIP) presentation

The symbology elements and terrain overlay mentioned above play a key role in the implementation of the PIP concept. With PIP, the video image is condensed such that it only takes up a portion of the display width so that a synthetic view can be presented surrounding the video image, thereby virtually increasing the overall field-of-view available to the operator. In implementing the PIP concept, several unique design questions arise:

- Should the location of the sensor imagery be fixed in the center of the display, or should the operator be able to adjust its location?
- Should the operator be able to pan the field-of-view presented in the PIP?
- How many size ratios (synthetic scene/camera image) should be made available to the UAV operator, and should the operator's size selection be continuous or discrete, the latter involving selection between pre-established ratios?

An ongoing simulation evaluation (see section 4.2) is addressing these issues along with an even more fundamental design question: does the PIP concept improve situation awareness and target prosecution?

3.4 Information clutter

The ability to provide a synthetic vision system overlaid on a sensor image can have both positive and negative impacts 7,8. Having all the information on one display can minimize scanning and the effort required to access and monitor all the elements. However, the overlay clutter may inhibit the processing of fine detail in the sensor imagery. Moreover, the capability afforded by synthetic vision overlays to let operators see data about objects that are not visible in the real sensor imagery can increase clutter. For example, with this "Superman X-ray vision," a threat that is visually occluded behind a mountain might continue to be depicted with an overlaid symbol. With this portrayal, however, the operator can lose occlusion cues, which are important for perceiving depth. Presenting occluded objects with dotted or blurred outlines might help operators track

where elements are located in the three-dimensional world. Thus, the design of the synthetic vision symbology needs to take into account the potentially negative effects of information clutter by including only elements that will benefit the operators' situation awareness and performance, and by employing design features that minimize clutter effects and confusion. Regardless of the symbology set, operators should be provided with the capability to declutter the synthetic vision symbology. A declutter function is already provided in many UAV control stations to control the degree to which flight symbology is portrayed. A similar function can be applied to the synthetic vision system whereby the operator controls the amount of synthetic information portrayed. However, research is needed on how best to implement decluttering modes for different UAV applications. More global levels of declutter may be optimal, whereby the operator can systematically select and deselect classes of information. For instance, perhaps only threat information is desired, with no other synthetic information. Another approach would allow operators to de-select individual symbology elements, for instance, those adjacent to a target of which the operator needs an unobstructed view.

3.5 Information view management

Providing the operator with the ability to declutter the synthetic vision system symbology is one method of managing how information is presented. However, there are numerous other techniques for view management which maintain visual constraints on the projections of objects on the display 9. With appropriate algorithms, the system can prevent objects from occluding each other by modifying selected object properties such as position, size, and transparency. By adjusting the manner in which synthetic vision symbology is presented, problems with different synthetic elements occluding each other can be minimized, as can a synthetic element occluding a key element in the real sensor image. Likewise, an intelligent system can ensure that distant text does not become illegible and that labels are automatically reoriented and repositioned based on the operator's viewpoint with respect to the object. Research is needed to identify the algorithms of highest utility for the task at hand. Besides managing the view to optimize the visibility of the synthetic vision symbology, an intelligent system can highlight in some fashion when new synthetic elements appear that are critical for operator attention. Conversely, the capability to retrieve dated information might be useful, for instance to review past flight paths or conduct battle damage assessment. The system can also help the operator retain spatial context with respect to the overall situation by interpolating between old and new viewpoints over a transitional period of a few seconds, slowing down the rate of transition 10. Evaluation is required to see whether this benefit to spatial awareness outweighs the cost of less responsive direct camera control. Identifying useful coding methods to indicate the criticality, urgency, and timeliness of information depicted by elements is another research topic.
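Both the class-based declutter modes of section 3.4 and the view-management policies above amount to a per-frame filtering pass over the overlay elements. A minimal sketch, assuming each element carries a category, a priority, and a projected screen position (all names are illustrative):

    from dataclasses import dataclass

    @dataclass
    class OverlayElement:
        label: str
        category: str   # e.g. "threat", "landmark", "route"
        priority: int   # higher values are more important to keep visible
        u: float        # projected screen position, pixels
        v: float

    def declutter(elements, enabled_categories, min_sep_px=40.0):
        """Class-based declutter plus a greedy overlap-suppression pass."""
        kept = []
        for el in sorted(elements, key=lambda e: -e.priority):
            if el.category not in enabled_categories:
                continue  # whole class deselected by the operator
            # Suppress lower-priority elements whose symbols would collide
            # with one already kept; a fuller view manager would reposition
            # or resize them instead of hiding them.
            if all((el.u - k.u) ** 2 + (el.v - k.v) ** 2 >= min_sep_px ** 2
                   for k in kept):
                kept.append(el)
        return kept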
3.6 Effect of synthetic vision symbology on retrieval of non-synthetic information

Cognitive tunneling can occur when the operator becomes focused on an element of the synthetic vision symbology (or on objects to which attention is directed by the synthetic symbology) to such an extent that other important objects or events in the sensor imagery are not attended 11. In the case of UAVs, this may result in the operator not detecting unexpected, high-interest targets. Scene-linking the synthetic vision symbology may reduce the incidence of cognitive tunneling 12. With scene-linking, the augmented information is integrated into the visual scene, rather than superimposed. It is thought that scene-linking helps by grouping the synthetic information and real sensor information into one perceptual group, thus reducing problems associated with attentional allocation. (This is based on object-based models of visual attention, which postulate that complex scenes are parsed into groups of objects, with attention focused on only one object at a time and object groups defined by contours, color, etc. 13) However, increasing the amount of information presented via synthetic vision overlays could increase the risk of cognitive tunneling by the operator. Cognitive tunneling is also an issue for the PIP display concept for synthetic vision systems. With PIP, real video imagery is surrounded by a synthetic view, thereby virtually increasing the field of view visible to the operator. There is past research evaluating the use of concurrent exocentric maps for improving localization performance; in that case, the embedded map was opaque, and the operator could pan the insert to see the view behind it 14. It is not clear whether the PIP inset will constitute a different perceptual group, and thus promote cognitive tunneling. The fact that the surrounding view is an extension of the scene depicted within the PIP, and that the PIP's transparency can be manipulated, may help perceptual grouping of the two scenes. Experimental evaluations are underway to determine this.
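As a concrete illustration of the PIP geometry and the transparency manipulation just mentioned, the following is a minimal compositing sketch (NumPy arrays; the centred, fixed inset location and the parameter names are assumptions, and the size ratio could equally be restricted to discrete pre-established values):

    import numpy as np

    def pip_composite(video, synthetic, ratio=0.5, inset_alpha=1.0):
        """Embed a shrunken video inset in a full-FOV synthetic surround.

        video, synthetic: H x W x 3 float arrays of the same display size.
        """
        h, w = synthetic.shape[:2]
        ih, iw = int(h * ratio), int(w * ratio)
        # Nearest-neighbour shrink of the video frame to the inset size.
        rows = (np.arange(ih) * (video.shape[0] / ih)).astype(int)
        cols = (np.arange(iw) * (video.shape[1] / iw)).astype(int)
        inset = video[rows][:, cols]
        out = synthetic.astype(float)
        top, left = (h - ih) // 2, (w - iw) // 2  # fixed, centred inset
        region = out[top:top + ih, left:left + iw]
        # A partially transparent inset may aid grouping of the two scenes.
        out[top:top + ih, left:left + iw] = (
            inset_alpha * inset + (1.0 - inset_alpha) * region)
        return out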

3.7 Blending of synthetic vision display and sensor image

One advantage of a synthetic vision system is its potential to provide mission information when the video datalink is degraded or visibility is limited. At a maximum setting, the synthetic vision imagery could totally replace the sensor image, while other settings could specify a blending of the two information sources by changing the transparency of the entire synthetic vision symbology set. Research issues include determining suitable methods to invoke imagery blending (discrete steps versus continuous control) and which type of terrain overlay is most suitable for blending. Blending techniques may be appropriate for individual symbology elements as well. For instance, gradual blending of the real and synthetic information along the edges of an object in the sensor image may help create a smooth transition between the synthetic and real objects at points where occlusion occurs or there is an error in registration.

3.8 Distributed network collaborative communication of synthetic vision system information

It is plausible that a synthetic vision system can play a key role in supporting distributed collaborative communication in the net-centric environment envisioned for the UAV domain. Besides providing a common operating picture of available battlespace information, one individual could mark a specific spatially referenced point of interest on a workstation, causing duplicate informative synthetic symbology to appear on the displays of other geographically separated stations in the warfare network. Thus, the synthetic vision system can be applied both as a display and as a control. To date, little research has addressed the many issues associated with implementing such a capability. For instance, one question is how best to keep each network member informed on the status of a new designation: its source, status of coordination from others, timeliness, priority, etc. How should far-off objects, beyond one's normal line-of-sight or off-boresight, be represented? What methods are suitable for teamwork and planning?

3.9 Reliability of information

Synthetic vision systems are based on data drawn from one or more data sources, and the reliability, accuracy, and currency of that information will vary. Additionally, a source may be reliable for one type of information but less reliable for other information types. It may be useful for UAV operators to be able to drill down to the data source for specific elements, to help judge the veracity of the data. It may also be possible to implement algorithms that weight the reliability of information and portray the certainty level with some type of coding method.

3.10 Adequacy of the performance of the synthetic vision system

Objects in the synthetic world and real world must be properly aligned (i.e., registered) with respect to each other on the display, or the illusion that the two worlds coexist will be compromised 15. If registration errors are systematic, operators might be able to adapt. Indeed, that is one research question: how much registration error is tolerable for a UAV application before task performance degrades substantially? Likewise, how much time delay can an operator tolerate? The time delay discussed here refers to the time difference between the measurement of the position and orientation of the sensor viewpoint and the moment when the synthetic image corresponding to that position and orientation appears in the display.
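The delay just defined can be partially offset in software. A minimal sketch, assuming telemetry arrives as timestamped dictionaries of floating-point fields, extrapolates the two most recent samples forward by the measured latency before rendering; fielded predictors would be considerably more sophisticated:

    def predict_pose(prev, curr, t_prev, t_curr, latency_s):
        """Linearly extrapolate pose fields forward by the display latency.

        prev/curr: dicts of float telemetry fields (lat, lon, alt, pitch,
        heading, roll, ...). Angle wrap-around (e.g. heading crossing 360
        degrees) is deliberately ignored in this sketch.
        """
        dt = t_curr - t_prev
        if dt <= 0:
            return dict(curr)  # cannot estimate a rate of change
        k = latency_s / dt
        return {name: value + k * (value - prev[name])
                for name, value in curr.items()}

    # e.g. pose_for_render = predict_pose(p0, p1, 0.0, 0.5, latency_s=0.12)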
Delays can cause registration errors and reduce task performance. There are several points in the overall system that contribute to both time delay and registration error, and that make it likely the problems will be variable. Perhaps most detrimental to the performance of a synthetic vision system are the update rate and accuracy of the flight data. The quality of the UAV positional data, for instance, is subject to quantization error, random delays, and basic measurement error, in addition to problems introduced by the telemetry system. Advances in prediction algorithm design may help overcome the limitations of imprecise and tardy data input to the synthetic vision system. Manual intervention should also be enabled whereby the operator can dynamically recalibrate the correspondence of the synthetic and real worlds.

3.11 Operator control of synthetic vision system functions

The preceding subsections delineate many capabilities that could be implemented, along with the synthetic vision system, to allow the operator to modify the symbology, e.g., the amount of symbology presented, characteristics of the picture-in-picture, and features of the distributed communication system. For each of these candidate functions, the

ideal control interface needs to be specified. The UAV operators' conventional controllers (keyboard, mouse, bezel switches, and joysticks to control camera zoom and direction and UAV flight) need to be examined as to how best to integrate these additional control requirements. At AFRL, speech-based control is being considered, whereby the operator's speech signals are used to carry out preset activities 16.

4. EVALUATION OF SYNTHETIC VISION SYSTEM FOR UAVS

The human factors issues raised in Section 3 demonstrate that there are many research questions relative to the application of synthetic vision systems to UAVs. What is not reflected in that section is the potential interaction of variables. A candidate symbology concept may only be beneficial if the clutter level is low, visibility is good, or there is minimal image motion. Or individual symbology concepts may each show a benefit, but when they are implemented together in a system, operator performance may degrade due to unacceptable clutter, etc. Additionally, many human factors guidelines will be application-specific. Thus, evaluations are needed that not only focus on specific research issues, but also assess the application of a total candidate synthetic vision system in several different UAV task environments. The end goal is to determine whether the synthetic vision system will benefit UAV operations and result in increased mission effectiveness. This confirmation involves several different types of evaluations, described below, many of which can be performed in parallel.

4.1 Usability Evaluations

Evaluations employing usability engineering tools enable a rapid design/evaluation/iteration cycle to identify promising synthetic vision system symbology concepts for UAV applications. Such an evaluation was conducted as part of the AFRL/Rapid Imaging Software, Inc. collaborative effort 17. With this process, the most promising candidate concepts were identified, taking into account operator profiles, use-case scenarios, and function requirements. A critical design review of these concepts was then conducted with UAV operators, system developers, and human factors engineers, using a series of computer-generated illustrations of how the synthetic vision concepts would be implemented in the performance of the use-case scenario. This process was found to be very valuable in identifying the strengths and weaknesses of several specific symbology sets. The results of this usability evaluation are now being addressed in detail in simulation evaluations.

4.2 Simulation Evaluations

The UAV control station simulation facility at Wright-Patterson Air Force Base (Figure 7) is being used to support a series of evaluations to address many of the issues identified in Section 3, utilizing the most promising symbology concepts identified in the usability evaluation. The high-fidelity simulation environment allows for adequate control of experimental conditions, manipulation of variables not possible in operational flight tests, and use of scripted, off-nominal events that either occur infrequently or are unsafe to test in a real-world environment. Events such as camera slew error can be inserted into the simulation in a manner that allows for complete repeatability across trials. Specifically, these evaluations will help identify optimal control techniques for information display, resolve issues of display clutter, and identify those concepts that result in the highest mission performance.
For example, the objectives of a study currently underway are to determine whether use of symbology flagging landmark locations, together with the picture-in-picture concept, will speed designation of known targets without impacting the detection of unexpected targets.

Figure 7. UAV sensor operator control station.
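One way the repeatability noted above can be achieved is by generating each trial's off-nominal event schedule from a fixed seed, so that every participant encounters, for example, the same camera slew error at the same mission time. A minimal sketch with illustrative event fields:

    import random

    def build_event_schedule(seed, mission_length_s, n_events=3):
        """Generate a trial's off-nominal events from a fixed seed."""
        rng = random.Random(seed)  # per-trial seed => exact repeatability
        times = sorted(rng.uniform(60.0, mission_length_s - 60.0)
                       for _ in range(n_events))
        return [{"t": t, "type": "camera_slew_error",
                 "magnitude_deg": rng.uniform(1.0, 5.0)} for t in times]

    # The same seed always reproduces the same schedule across trials:
    assert build_event_schedule(42, 1800.0) == build_event_schedule(42, 1800.0)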

4.3 Flight Test Evaluations

Flight demonstrations and flight tests are required to demonstrate that the synthetic vision symbology can be successfully integrated with the UAV platforms being targeted and that the performance requirements (3.10) can adequately be met. Flight tests can also confirm results from simulation evaluations indicating that the synthetic vision symbology improves operator performance and does not negatively impact any operator tasking. In other words, flight tests validate whether the synthetic vision system can be successfully utilized and is beneficial in the intended UAV environment. An earlier version of the SmartCam3D synthetic vision system was successfully demonstrated on the NASA X-38 vehicle during flight testing. During one of the flights, a control problem resulted in an unexpected 180-degree roll. Because the synthetic vision system offered improved situation awareness, operators watching the SmartCam system became aware of the problem long before the flight test engineers, who had to glean that something was amiss from a display of six rapidly changing numbers. Results from these flight tests, as well as subsequent integration and tests, provide further support for the utility of a synthetic vision system. For the candidate synthetic vision system resulting from this AFRL/Rapid Imaging Software, Inc. collaborative effort, planning for flight tests is underway. Additionally, the system has already been successfully integrated into a UAV ground control station, and favorable comments have been received from operators.

4.4 Summary

Synthetic vision system technology promises to enhance situation awareness for UAV operations, as well as decrease workload, improve network collaborative communication, and minimize the effects of video datalink degradation. Predicted operational benefits include faster target acquisition and assessment, more targets serviced, and reduced potential for collateral damage. There are, however, numerous questions pertaining to the design, implementation, and integration of a synthetic vision system in UAV applications.

REFERENCES

1. V.J. Gawron, "Human factors issues in the development, evaluation and operations of uninhabited air vehicles," Proceedings of the Association for Unmanned Vehicle Systems International (AUVSI), 1998.
2. K.M. Stanney (Ed.), Handbook of Virtual Environments: Design, Implementation, and Applications, Lawrence Erlbaum, New Jersey.
3. C.D. Wickens and J. Long, "Object versus space-based models of visual attention: Implications for the design of head-up displays," Journal of Experimental Psychology: Applied, 1(3), 1995.
4. P.M. Ververs and C.D. Wickens, "Conformal flight path symbology for head-up displays: defining the distribution of visual attention in three-dimensional space," Aviation Research Lab, Institute of Aviation, Technical Report ARL-98-5/NASA-98-1.
5. M.H. Draper, H.A. Ruff, J.V. Fontejon, and S.J. Napier, "The effects of head-coupled control and a head-mounted display (HMD) on large-area search tasks," Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting, 2002.
6. S. Fadden, P.M. Ververs, and C.D. Wickens, "Costs and benefits of head-up display use: A meta-analytic approach," Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, 1998.
7. P. Kroft and C.D. Wickens, "Displaying multi-domain graphical data base information," Information Design Journal, 11(1), 2003.
8. A. Stedmon, R. Kalawsky, K. Hill, and C. Cook, "Old theories, new technologies: cumulative clutter effects using augmented reality," IEEE International Conference on Information Visualization '99, London, UK, July 1999.
9. B. Bell, S. Feiner, and T. Hollerer, "View management for virtual and augmented reality," ACM Symposium on User Interface Software and Technology, November 2001.
10. R. Azuma, M. Daily, and J. Krozel, "Advanced human-computer interfaces for air traffic management and simulation," Proceedings of the 1996 AIAA Flight Simulation Technology Conference, Reston, VA, July 1996.
11. M. Yeh and C.D. Wickens, "Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration," Human Factors, 43(3), 2001.
12. J.L. Levy, D.C. Foyle, and R.S. McCann, "Performance benefits with scene-linked HUD symbology: an attentional phenomenon?" Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, 1998.
13. R.S. McCann and D.C. Foyle, "Superimposed symbology: Attentional problems and design solutions," SAE Transactions: Journal of Aerospace, 103, 1994.
14. C.D. Wickens, L.C. Thomas, and R. Young, "Frames of reference for the display of battlefield information: Judgment-display dependencies," Human Factors, 42(4), 2000.
15. R.T. Azuma, "A survey of augmented reality," Presence: Teleoperators and Virtual Environments, 6(4), August 1997.
16. M.H. Draper, G.L. Calhoun, D. Williamson, H.A. Ruff, and T. Barry, "Manual versus speech input for unmanned aerial vehicle control station operations," Proceedings of the Human Factors and Ergonomics Society, 2003.
17. M.H. Draper, W.T. Nelson, M.F. Abernathy, and G.L. Calhoun, "Synthetic vision overlay for improving UAV operations," Proceedings of the Association for Unmanned Vehicle Systems International (AUVSI), 2004.


More information

Automatic Payload Deployment System (APDS)

Automatic Payload Deployment System (APDS) Automatic Payload Deployment System (APDS) Brian Suh Director, T2 Office WBT Innovation Marketplace 2012 Report Documentation Page Form Approved OMB No. 0704-0188 Public reporting burden for the collection

More information

Modeling Antennas on Automobiles in the VHF and UHF Frequency Bands, Comparisons of Predictions and Measurements

Modeling Antennas on Automobiles in the VHF and UHF Frequency Bands, Comparisons of Predictions and Measurements Modeling Antennas on Automobiles in the VHF and UHF Frequency Bands, Comparisons of Predictions and Measurements Nicholas DeMinco Institute for Telecommunication Sciences U.S. Department of Commerce Boulder,

More information

FAA Research and Development Efforts in SHM

FAA Research and Development Efforts in SHM FAA Research and Development Efforts in SHM P. SWINDELL and D. P. ROACH ABSTRACT SHM systems are being developed using networks of sensors for the continuous monitoring, inspection and damage detection

More information

Helicopter Aerial Laser Ranging

Helicopter Aerial Laser Ranging Helicopter Aerial Laser Ranging Håkan Sterner TopEye AB P.O.Box 1017, SE-551 11 Jönköping, Sweden 1 Introduction Measuring distances with light has been used for terrestrial surveys since the fifties.

More information

Department of Energy Technology Readiness Assessments Process Guide and Training Plan

Department of Energy Technology Readiness Assessments Process Guide and Training Plan Department of Energy Technology Readiness Assessments Process Guide and Training Plan Steven Krahn, Kurt Gerdes Herbert Sutter Department of Energy Consultant, Department of Energy 2008 Technology Maturity

More information

A New Scheme for Acoustical Tomography of the Ocean

A New Scheme for Acoustical Tomography of the Ocean A New Scheme for Acoustical Tomography of the Ocean Alexander G. Voronovich NOAA/ERL/ETL, R/E/ET1 325 Broadway Boulder, CO 80303 phone (303)-497-6464 fax (303)-497-3577 email agv@etl.noaa.gov E.C. Shang

More information

Sea Surface Backscatter Distortions of Scanning Radar Altimeter Ocean Wave Measurements

Sea Surface Backscatter Distortions of Scanning Radar Altimeter Ocean Wave Measurements Sea Surface Backscatter Distortions of Scanning Radar Altimeter Ocean Wave Measurements Edward J. Walsh and C. Wayne Wright NASA Goddard Space Flight Center Wallops Flight Facility Wallops Island, VA 23337

More information

14. Model Based Systems Engineering: Issues of application to Soft Systems

14. Model Based Systems Engineering: Issues of application to Soft Systems DSTO-GD-0734 14. Model Based Systems Engineering: Issues of application to Soft Systems Ady James, Alan Smith and Michael Emes UCL Centre for Systems Engineering, Mullard Space Science Laboratory Abstract

More information

The EDA SUM Project. Surveillance in an Urban environment using Mobile sensors. 2012, September 13 th - FMV SENSORS SYMPOSIUM 2012

The EDA SUM Project. Surveillance in an Urban environment using Mobile sensors. 2012, September 13 th - FMV SENSORS SYMPOSIUM 2012 Surveillance in an Urban environment using Mobile sensors 2012, September 13 th - FMV SENSORS SYMPOSIUM 2012 TABLE OF CONTENTS European Defence Agency Supported Project 1. SUM Project Description. 2. Subsystems

More information

AN INSTRUMENTED FLIGHT TEST OF FLAPPING MICRO AIR VEHICLES USING A TRACKING SYSTEM

AN INSTRUMENTED FLIGHT TEST OF FLAPPING MICRO AIR VEHICLES USING A TRACKING SYSTEM 18 TH INTERNATIONAL CONFERENCE ON COMPOSITE MATERIALS AN INSTRUMENTED FLIGHT TEST OF FLAPPING MICRO AIR VEHICLES USING A TRACKING SYSTEM J. H. Kim 1*, C. Y. Park 1, S. M. Jun 1, G. Parker 2, K. J. Yoon

More information

A Comparison of Two Computational Technologies for Digital Pulse Compression

A Comparison of Two Computational Technologies for Digital Pulse Compression A Comparison of Two Computational Technologies for Digital Pulse Compression Presented by Michael J. Bonato Vice President of Engineering Catalina Research Inc. A Paravant Company High Performance Embedded

More information

Underwater Intelligent Sensor Protection System

Underwater Intelligent Sensor Protection System Underwater Intelligent Sensor Protection System Peter J. Stein, Armen Bahlavouni Scientific Solutions, Inc. 18 Clinton Drive Hollis, NH 03049-6576 Phone: (603) 880-3784, Fax: (603) 598-1803, email: pstein@mv.mv.com

More information

SA Joint USN/USMC Spectrum Conference. Gerry Fitzgerald. Organization: G036 Project: 0710V250-A1

SA Joint USN/USMC Spectrum Conference. Gerry Fitzgerald. Organization: G036 Project: 0710V250-A1 SA2 101 Joint USN/USMC Spectrum Conference Gerry Fitzgerald 04 MAR 2010 DISTRIBUTION A: Approved for public release Case 10-0907 Organization: G036 Project: 0710V250-A1 Report Documentation Page Form Approved

More information

RADAR SATELLITES AND MARITIME DOMAIN AWARENESS

RADAR SATELLITES AND MARITIME DOMAIN AWARENESS RADAR SATELLITES AND MARITIME DOMAIN AWARENESS J.K.E. Tunaley Corporation, 114 Margaret Anne Drive, Ottawa, Ontario K0A 1L0 (613) 839-7943 Report Documentation Page Form Approved OMB No. 0704-0188 Public

More information

POSTPRINT UNITED STATES AIR FORCE RESEARCH ON AIRFIELD PAVEMENT REPAIRS USING PRECAST PORTLAND CEMENT CONCRETE (PCC) SLABS (BRIEFING SLIDES)

POSTPRINT UNITED STATES AIR FORCE RESEARCH ON AIRFIELD PAVEMENT REPAIRS USING PRECAST PORTLAND CEMENT CONCRETE (PCC) SLABS (BRIEFING SLIDES) POSTPRINT AFRL-RX-TY-TP-2008-4582 UNITED STATES AIR FORCE RESEARCH ON AIRFIELD PAVEMENT REPAIRS USING PRECAST PORTLAND CEMENT CONCRETE (PCC) SLABS (BRIEFING SLIDES) Athar Saeed, PhD, PE Applied Research

More information

RF Performance Predictions for Real Time Shipboard Applications

RF Performance Predictions for Real Time Shipboard Applications DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. RF Performance Predictions for Real Time Shipboard Applications Dr. Richard Sprague SPAWARSYSCEN PACIFIC 5548 Atmospheric

More information

Key Issues in Modulating Retroreflector Technology

Key Issues in Modulating Retroreflector Technology Key Issues in Modulating Retroreflector Technology Dr. G. Charmaine Gilbreath, Code 7120 Naval Research Laboratory 4555 Overlook Ave., NW Washington, DC 20375 phone: (202) 767-0170 fax: (202) 404-8894

More information

Inertial Navigation/Calibration/Precise Time and Frequency Capabilities Larry M. Galloway and James F. Barnaba Newark Air Force Station, Ohio

Inertial Navigation/Calibration/Precise Time and Frequency Capabilities Larry M. Galloway and James F. Barnaba Newark Air Force Station, Ohio AEROSPACE GUIDANCE AND METROLOGY CENTER (AGMC) Inertial Navigation/Calibration/Precise Time and Frequency Capabilities Larry M. Galloway and James F. Barnaba Newark Air Force Station, Ohio ABSTRACT The

More information

Modeling and Evaluation of Bi-Static Tracking In Very Shallow Water

Modeling and Evaluation of Bi-Static Tracking In Very Shallow Water Modeling and Evaluation of Bi-Static Tracking In Very Shallow Water Stewart A.L. Glegg Dept. of Ocean Engineering Florida Atlantic University Boca Raton, FL 33431 Tel: (954) 924 7241 Fax: (954) 924-7270

More information

EnVis and Hector Tools for Ocean Model Visualization LONG TERM GOALS OBJECTIVES

EnVis and Hector Tools for Ocean Model Visualization LONG TERM GOALS OBJECTIVES EnVis and Hector Tools for Ocean Model Visualization Robert Moorhead and Sam Russ Engineering Research Center Mississippi State University Miss. State, MS 39759 phone: (601) 325 8278 fax: (601) 325 7692

More information

Electro-Optic Identification Research Program: Computer Aided Identification (CAI) and Automatic Target Recognition (ATR)

Electro-Optic Identification Research Program: Computer Aided Identification (CAI) and Automatic Target Recognition (ATR) Electro-Optic Identification Research Program: Computer Aided Identification (CAI) and Automatic Target Recognition (ATR) Phone: (850) 234-4066 Phone: (850) 235-5890 James S. Taylor, Code R22 Coastal Systems

More information

DIELECTRIC ROTMAN LENS ALTERNATIVES FOR BROADBAND MULTIPLE BEAM ANTENNAS IN MULTI-FUNCTION RF APPLICATIONS. O. Kilic U.S. Army Research Laboratory

DIELECTRIC ROTMAN LENS ALTERNATIVES FOR BROADBAND MULTIPLE BEAM ANTENNAS IN MULTI-FUNCTION RF APPLICATIONS. O. Kilic U.S. Army Research Laboratory DIELECTRIC ROTMAN LENS ALTERNATIVES FOR BROADBAND MULTIPLE BEAM ANTENNAS IN MULTI-FUNCTION RF APPLICATIONS O. Kilic U.S. Army Research Laboratory ABSTRACT The U.S. Army Research Laboratory (ARL) is currently

More information

Customer Showcase > Defense and Intelligence

Customer Showcase > Defense and Intelligence Customer Showcase Skyline TerraExplorer is a critical visualization technology broadly deployed in defense and intelligence, public safety and security, 3D geoportals, and urban planning markets. It fuses

More information

Microsoft ESP Developer profile white paper

Microsoft ESP Developer profile white paper Microsoft ESP Developer profile white paper Reality XP Simulation www.reality-xp.com Background Microsoft ESP is a visual simulation platform that brings immersive games-based technology to training and

More information

Perspective View Displays and User Performance

Perspective View Displays and User Performance 186 Perspective View Displays and User Performance Michael B. Cowen SSC San Diego INTRODUCTION Objects and scenes displayed on a flat screen from a 30- to 60-degree perspective viewing angle can convey

More information

Investigation of a Forward Looking Conformal Broadband Antenna for Airborne Wide Area Surveillance

Investigation of a Forward Looking Conformal Broadband Antenna for Airborne Wide Area Surveillance Investigation of a Forward Looking Conformal Broadband Antenna for Airborne Wide Area Surveillance Hany E. Yacoub Department Of Electrical Engineering & Computer Science 121 Link Hall, Syracuse University,

More information

Issues and Challenges of 3D User Interfaces: Effects of Distraction

Issues and Challenges of 3D User Interfaces: Effects of Distraction Issues and Challenges of 3D User Interfaces: Effects of Distraction Leslie Klein kleinl@in.tum.de In time critical tasks like when driving a car or in emergency management, 3D user interfaces provide an

More information

VHF/UHF Imagery of Targets, Decoys, and Trees

VHF/UHF Imagery of Targets, Decoys, and Trees F/UHF Imagery of Targets, Decoys, and Trees A. J. Gatesman, C. Beaudoin, R. Giles, J. Waldman Submillimeter-Wave Technology Laboratory University of Massachusetts Lowell J.L. Poirier, K.-H. Ding, P. Franchi,

More information

Lattice Spacing Effect on Scan Loss for Bat-Wing Phased Array Antennas

Lattice Spacing Effect on Scan Loss for Bat-Wing Phased Array Antennas Lattice Spacing Effect on Scan Loss for Bat-Wing Phased Array Antennas I. Introduction Thinh Q. Ho*, Charles A. Hewett, Lilton N. Hunt SSCSD 2825, San Diego, CA 92152 Thomas G. Ready NAVSEA PMS500, Washington,

More information

Satellite Observations of Nonlinear Internal Waves and Surface Signatures in the South China Sea

Satellite Observations of Nonlinear Internal Waves and Surface Signatures in the South China Sea DISTRIBUTION STATEMENT A: Distribution approved for public release; distribution is unlimited Satellite Observations of Nonlinear Internal Waves and Surface Signatures in the South China Sea Hans C. Graber

More information

Sky Satellites: The Marine Corps Solution to its Over-The-Horizon Communication Problem

Sky Satellites: The Marine Corps Solution to its Over-The-Horizon Communication Problem Sky Satellites: The Marine Corps Solution to its Over-The-Horizon Communication Problem Subject Area Electronic Warfare EWS 2006 Sky Satellites: The Marine Corps Solution to its Over-The- Horizon Communication

More information

Target Behavioral Response Laboratory

Target Behavioral Response Laboratory Target Behavioral Response Laboratory APPROVED FOR PUBLIC RELEASE John Riedener Technical Director (973) 724-8067 john.riedener@us.army.mil Report Documentation Page Form Approved OMB No. 0704-0188 Public

More information

MERQ EVALUATION SYSTEM

MERQ EVALUATION SYSTEM UNCLASSIFIED MERQ EVALUATION SYSTEM Multi-Dimensional Assessment of Technology Maturity Conference 10 May 2006 Mark R. Dale Chief, Propulsion Branch Turbine Engine Division Propulsion Directorate Air Force

More information

Advancing Autonomy on Man Portable Robots. Brandon Sights SPAWAR Systems Center, San Diego May 14, 2008

Advancing Autonomy on Man Portable Robots. Brandon Sights SPAWAR Systems Center, San Diego May 14, 2008 Advancing Autonomy on Man Portable Robots Brandon Sights SPAWAR Systems Center, San Diego May 14, 2008 Report Documentation Page Form Approved OMB No. 0704-0188 Public reporting burden for the collection

More information

THE DET CURVE IN ASSESSMENT OF DETECTION TASK PERFORMANCE

THE DET CURVE IN ASSESSMENT OF DETECTION TASK PERFORMANCE THE DET CURVE IN ASSESSMENT OF DETECTION TASK PERFORMANCE A. Martin*, G. Doddington#, T. Kamm+, M. Ordowski+, M. Przybocki* *National Institute of Standards and Technology, Bldg. 225-Rm. A216, Gaithersburg,

More information

David Siegel Masters Student University of Cincinnati. IAB 17, May 5 7, 2009 Ford & UM

David Siegel Masters Student University of Cincinnati. IAB 17, May 5 7, 2009 Ford & UM Alternator Health Monitoring For Vehicle Applications David Siegel Masters Student University of Cincinnati Report Documentation Page Form Approved OMB No. 0704-0188 Public reporting burden for the collection

More information

AFRL-RI-RS-TR

AFRL-RI-RS-TR AFRL-RI-RS-TR-2015-012 ROBOTICS CHALLENGE: COGNITIVE ROBOT FOR GENERAL MISSIONS UNIVERSITY OF KANSAS JANUARY 2015 FINAL TECHNICAL REPORT APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED STINFO COPY

More information

OPTICAL EMISSION CHARACTERISTICS OF HELIUM BREAKDOWN AT PARTIAL VACUUM FOR POINT TO PLANE GEOMETRY

OPTICAL EMISSION CHARACTERISTICS OF HELIUM BREAKDOWN AT PARTIAL VACUUM FOR POINT TO PLANE GEOMETRY OPTICAL EMISSION CHARACTERISTICS OF HELIUM BREAKDOWN AT PARTIAL VACUUM FOR POINT TO PLANE GEOMETRY K. Koppisetty ξ, H. Kirkici 1, D. L. Schweickart 2 1 Auburn University, Auburn, Alabama 36849, USA, 2

More information

Measurement of Ocean Spatial Coherence by Spaceborne Synthetic Aperture Radar

Measurement of Ocean Spatial Coherence by Spaceborne Synthetic Aperture Radar Measurement of Ocean Spatial Coherence by Spaceborne Synthetic Aperture Radar Frank Monaldo, Donald Thompson, and Robert Beal Ocean Remote Sensing Group Johns Hopkins University Applied Physics Laboratory

More information

Management of Toxic Materials in DoD: The Emerging Contaminants Program

Management of Toxic Materials in DoD: The Emerging Contaminants Program SERDP/ESTCP Workshop Carole.LeBlanc@osd.mil Surface Finishing and Repair Issues 703.604.1934 for Sustaining New Military Aircraft February 26-28, 2008, Tempe, Arizona Management of Toxic Materials in DoD:

More information

Ground Based GPS Phase Measurements for Atmospheric Sounding

Ground Based GPS Phase Measurements for Atmospheric Sounding Ground Based GPS Phase Measurements for Atmospheric Sounding Principal Investigator: Randolph Ware Co-Principal Investigator Christian Rocken UNAVCO GPS Science and Technology Program University Corporation

More information

3D Animation of Recorded Flight Data

3D Animation of Recorded Flight Data 3D Animation of Recorded Flight Data *Carole Bolduc **Wayne Jackson *Software Kinetics Ltd, 65 Iber Rd, Stittsville, Ontario, Canada K2S 1E7 Tel: (613) 831-0888, Email: Carole.Bolduc@SoftwareKinetics.ca

More information

See highlights on pages 1 and 5

See highlights on pages 1 and 5 See highlights on pages 1 and 5 Foyle, D.C., McCann, R.S. and Shelden, S.G. (1995). In R.S. Jensen & L.A. Rakovan (Eds.), Proceedings of the Eighth International Symposium on Aviation Psychology, 98-103.

More information

Investigation of Modulated Laser Techniques for Improved Underwater Imaging

Investigation of Modulated Laser Techniques for Improved Underwater Imaging Investigation of Modulated Laser Techniques for Improved Underwater Imaging Linda J. Mullen NAVAIR, EO and Special Mission Sensors Division 4.5.6, Building 2185 Suite 1100-A3, 22347 Cedar Point Road Unit

More information

Evanescent Acoustic Wave Scattering by Targets and Diffraction by Ripples

Evanescent Acoustic Wave Scattering by Targets and Diffraction by Ripples Evanescent Acoustic Wave Scattering by Targets and Diffraction by Ripples PI name: Philip L. Marston Physics Department, Washington State University, Pullman, WA 99164-2814 Phone: (509) 335-5343 Fax: (509)

More information

THE NATIONAL SHIPBUILDING RESEARCH PROGRAM

THE NATIONAL SHIPBUILDING RESEARCH PROGRAM SHIP PRODUCTION COMMITTEE FACILITIES AND ENVIRONMENTAL EFFECTS SURFACE PREPARATION AND COATINGS DESIGN/PRODUCTION INTEGRATION HUMAN RESOURCE INNOVATION MARINE INDUSTRY STANDARDS WELDING INDUSTRIAL ENGINEERING

More information

Improving Airport Planning & Development and Operations & Maintenance via Skyline 3D Software

Improving Airport Planning & Development and Operations & Maintenance via Skyline 3D Software Improving Airport Planning & Development and Operations & Maintenance via Skyline 3D Software By David Tamir, February 2014 Skyline Software Systems has pioneered web-enabled 3D information mapping and

More information

Coverage Metric for Acoustic Receiver Evaluation and Track Generation

Coverage Metric for Acoustic Receiver Evaluation and Track Generation Coverage Metric for Acoustic Receiver Evaluation and Track Generation Steven M. Dennis Naval Research Laboratory Stennis Space Center, MS 39529, USA Abstract-Acoustic receiver track generation has been

More information

Solar Radar Experiments

Solar Radar Experiments Solar Radar Experiments Paul Rodriguez Plasma Physics Division Naval Research Laboratory Washington, DC 20375 phone: (202) 767-3329 fax: (202) 767-3553 e-mail: paul.rodriguez@nrl.navy.mil Award # N0001498WX30228

More information

Challenges UAV operators face in maintaining spatial orientation Lee Gugerty Clemson University

Challenges UAV operators face in maintaining spatial orientation Lee Gugerty Clemson University Challenges UAV operators face in maintaining spatial orientation Lee Gugerty Clemson University Overview Task analysis of Predator UAV operations UAV synthetic task Spatial orientation challenges Data

More information

HIGH TEMPERATURE (250 C) SIC POWER MODULE FOR MILITARY HYBRID ELECTRICAL VEHICLE APPLICATIONS

HIGH TEMPERATURE (250 C) SIC POWER MODULE FOR MILITARY HYBRID ELECTRICAL VEHICLE APPLICATIONS HIGH TEMPERATURE (250 C) SIC POWER MODULE FOR MILITARY HYBRID ELECTRICAL VEHICLE APPLICATIONS R. M. Schupbach, B. McPherson, T. McNutt, A. B. Lostetter John P. Kajs, and Scott G Castagno 29 July 2011 :

More information

Assimilation Ionosphere Model

Assimilation Ionosphere Model Assimilation Ionosphere Model Robert W. Schunk Space Environment Corporation 399 North Main, Suite 325 Logan, UT 84321 phone: (435) 752-6567 fax: (435) 752-6687 email: schunk@spacenv.com Award #: N00014-98-C-0085

More information

Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics

Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics Carlos A. Malpica NASA Ames Research Center Moffett Field, CA Jeff A. Lusardi Aeroflightdynamics Directorate

More information

REPORT DOCUMENTATION PAGE

REPORT DOCUMENTATION PAGE REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 The public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions,

More information

Social Science: Disciplined Study of the Social World

Social Science: Disciplined Study of the Social World Social Science: Disciplined Study of the Social World Elisa Jayne Bienenstock MORS Mini-Symposium Social Science Underpinnings of Complex Operations (SSUCO) 18-21 October 2010 Report Documentation Page

More information

South Atlantic Bight Synoptic Offshore Observational Network

South Atlantic Bight Synoptic Offshore Observational Network South Atlantic Bight Synoptic Offshore Observational Network Charlie Barans Marine Resources Division South Carolina Department of Natural Resources P.O. Box 12559 Charleston, SC 29422 phone: (843) 762-5084

More information