PROGRESS ON THE SIMULATOR AND EYE-TRACKER FOR ASSESSMENT OF PVFR ROUTES AND SNI OPERATIONS FOR ROTORCRAFT

Rudolph P. Darken¹, Joseph A. Sullivan¹, and Jeffrey Mulligan²
¹Naval Postgraduate School, Monterey, CA; ²NASA Ames Research Center, Moffett Field, CA

Background: The concept of Precision Visual Flight Rules (PVFR) and Simultaneous Non-Interfering (SNI) routes for rotorcraft is based on the hypothesis that rotorcraft with Global Positioning System (GPS) navigation capabilities can stay within narrow, defined horizontal airspace limits while operating under Visual Flight Rules (VFR). If the pilot maintains the aircraft within the confines of a PVFR route, and if these routes can be designed to keep rotorcraft separated from fixed-wing traffic, then PVFR routes offer rotorcraft the possibility of operating in congested airspace simultaneously with fixed-wing aircraft on a non-interfering basis, hence the term SNI operation (Hickok & McConkey, 2003).

INTRODUCTION

The objective of this research program is to investigate Precision VFR (PVFR) routes for Simultaneous Non-Interfering (SNI) operations in the National Airspace System (NAS). For a variety of reasons, it is neither practical nor safe to investigate these routes exclusively using actual in-flight rotorcraft. The research plan calls for a series of steps that will facilitate this investigation:

1. Conduct a human factors investigation of in-flight performance of pilots during a PVFR route (this phase is described in detail in Hickok & McConkey, 2003).

2. Replicate the same task and environment used in the in-flight study using a virtual simulation. Compare human factors data (visual scan patterns, performance, etc.) to determine whether the simulation approximates actual flight and is therefore suitable for further investigation.

3. Assuming the simulator satisfies the requirements set out in the first phases of the program, we can then construct new PVFR routes and collect human performance data to determine their feasibility and predicted improvement over current standards.

The role of the Naval Postgraduate School is to construct the simulation environment for experimentation in the later phases of the program. NASA will construct a mobile eye-tracker for use in data collection; it must be suitable for use both in the actual rotorcraft and in the simulator.

This paper reports on progress in both of these parts of the research program.

THE SIMULATOR

The simulator consists of both an apparatus and related software and models. Our approach is to construct a simplified simulator that replicates only those aspects of the piloting task that are relevant to this research program. We have been conducting research on rotorcraft navigation and piloting for several years and have completed a cognitive task analysis and several prototype simulators for the overland navigation task (J. Sullivan, Darken, & McLean, 1998; J. A. Sullivan, 1998).

The Apparatus

The hardware apparatus for the SNI simulator consists of an internal environment (cockpit), to include seat, controls, and simulated gauges, and a display to simulate the external environment.

The selected aircraft for the experiment is the Army OH-58A. The primary data for pilot performance are airborne digital data of actual position and of head and eye position during flight. Tracking the position of the aircraft during simulated flight is relatively trivial. However, it is essential that the simulator replicate the displays and gauges of the OH-58A so that head position and eye gaze data from the airborne portion of the program can be compared with the simulator portion.
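Step 2 of the plan above hinges on comparing human factors data, particularly visual scan patterns, between the in-flight study and the simulator. Purely as an illustrative sketch (not the analysis actually used in this program), the fragment below bins gaze samples by instrument-panel area of interest and applies a chi-square test to the resulting dwell-time counts; the area-of-interest names and sample streams are invented for the example.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical areas of interest (AOIs); real AOIs would come from the
# OH-58A panel layout and the out-the-window scene.
AOIS = ["out_the_window", "airspeed", "altimeter", "attitude", "map"]

def dwell_counts(aoi_samples):
    """Count gaze samples in each AOI (samples arrive at a fixed rate,
    so counts are proportional to dwell time)."""
    return np.array([sum(s == a for s in aoi_samples) for a in AOIS])

def compare_scan_patterns(flight_samples, simulator_samples):
    """Chi-square test of whether the simulator produces a different
    distribution of attention across AOIs than actual flight."""
    table = np.vstack([dwell_counts(flight_samples),
                       dwell_counts(simulator_samples)])
    chi2, p, dof, _ = chi2_contingency(table)
    return chi2, p

# Example with made-up gaze-sample streams:
flight = (["out_the_window"] * 700 + ["airspeed"] * 120 +
          ["altimeter"] * 80 + ["attitude"] * 60 + ["map"] * 40)
sim = (["out_the_window"] * 650 + ["airspeed"] * 150 +
       ["altimeter"] * 90 + ["attitude"] * 70 + ["map"] * 40)
chi2, p = compare_scan_patterns(flight, sim)
print(f"chi2 = {chi2:.1f}, p = {p:.3f}")
```

Any real comparison would of course use the areas of interest defined for the OH-58A panel and the gaze data actually recorded in flight and in the simulator.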

We currently have a placeholder LCD panel for the simulated gauges, but a plan is in place to replace it with a full-scale panel built to OH-58A specifications. We will still use LCD panels to drive the gauges, but will use cut-outs in the panel with the LCD panel showing through where needed. Figure 1 shows the apparatus inside the projection screens; the LCD panel will be replaced shortly.

Figure 1. The SNI simulator apparatus.

To simulate the external environment, we use a three-screen wide field-of-view display. In the virtual environment literature, this is commonly referred to as a CAVE (Cruz-Neira, Sandin, & DeFanti, 1993). Our simulator can also use a chromakey bluescreen mixing capability that uses a head-mounted display with a camera mounted on it (Lennerton, 2003); however, this apparatus is likely unusable for this application because of incompatibilities with the eye-tracking device. Figure 2 shows a pilot in the simulator during use. The screen behind the pilot is a surrogate for the frame of the aircraft, limiting the visual field similarly to the actual airframe.

Figure 2. The SNI simulator during use.

The Virtual Environment

The environment chosen for the experimentation plan is the region immediately surrounding Tullahoma Regional Airport (THA) in Tullahoma, TN. Satellite Technology Implementation (STI) designed the flight routes as described in Hickok & McConkey (2003). A portion of the flight route is shown in Figure 3.

Figure 3. A portion of the flight route (from Hickok & McConkey, 2003).

After the region was selected, we collected publicly available digital data of the area and began to construct the virtual model for use in the simulator. We have used Digital Terrain Elevation Data (DTED) and aerial imagery to compose the model shown in Figure 4. We have approximated the locations of three of the waypoints also identified in Figure 3.
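As a rough illustration of how DTED elevation posts and ortho-rectified aerial imagery can be combined into a textured terrain model, the sketch below builds a triangle mesh from a regular elevation grid and assigns texture coordinates for draping the imagery. It assumes the elevation data has already been read into a 2-D array (for example with a GDAL-based reader) and uses placeholder values; it is not the tool chain actually used to build the Tullahoma model.

```python
import numpy as np

def terrain_mesh(elevation, post_spacing_m):
    """Build a triangle mesh from a regular grid of elevation posts.

    elevation      : 2-D array of heights in meters (rows run north-south)
    post_spacing_m : distance between adjacent posts in meters
    Returns (vertices, uvs, triangles); the aerial image is draped by
    using each vertex's normalized grid position as its texture coordinate.
    """
    rows, cols = elevation.shape
    # Vertex positions: x east, y north, z up.
    xs = np.arange(cols) * post_spacing_m
    ys = np.arange(rows) * post_spacing_m
    xx, yy = np.meshgrid(xs, ys)
    vertices = np.column_stack([xx.ravel(), yy.ravel(), elevation.ravel()])

    # Texture coordinates: normalized position within the grid, so an
    # ortho-rectified aerial image maps directly onto the terrain.
    uvs = np.column_stack([(xx / xs[-1]).ravel(), (yy / ys[-1]).ravel()])

    # Two triangles per grid cell.
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            tris.append([i, i + 1, i + cols])
            tris.append([i + 1, i + cols + 1, i + cols])
    return vertices, uvs, np.array(tris)

# Toy example: a 4 x 4 grid of posts at 30 m spacing (roughly DTED level 2).
demo_elev = np.random.default_rng(0).uniform(250, 300, size=(4, 4))
v, uv, t = terrain_mesh(demo_elev, 30.0)
print(v.shape, uv.shape, t.shape)   # (16, 3) (16, 2) (18, 3)
```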

Figure 4. The virtual model.

The model is a work in progress. We have not yet implemented low-level features that will be needed for the PVFR route. The STI team will inform us of these features and their locations; we will then take photographs as needed and place these objects in the virtual model. The model is then imported into the simulation environment so that the virtual helicopter can fly through virtual Tullahoma.

EYE-TRACKING

The objective of the eye-tracking portion of the project is to capture head pose and eye gaze position during flight, either simulated or actual. Actual flight is obviously the more difficult of the two due to size and power constraints. The data must be captured and time-stamped for later evaluation.

Ames Portable Eye-Tracker

The purpose of this technology development project is to develop a lightweight, comfortable head-mounted eye-tracking device suitable for extended use in operational environments. The basic design is patterned on a prototype developed at the Rochester Institute of Technology and is based on a racquetball eye shield. A portion of the plastic lens is cut away and replaced with an adjustable hot mirror, which allows the subject a clear straight-ahead view but reflects an infrared image of the eye to a small camera mounted on the side of the frame. A second, forward-looking camera records the subject's eye view of the scene, which is used to locate the subject's head within the experimental environment. Images from the cameras are tiled into a single video signal using a quad processor and are then recorded for later analysis. The head mount may be directly connected to the recording unit (tethered operation) or connected using a 2.4 GHz wireless link. Before recording, time code is added to the signal; the initial time for the time code generator can be derived from a GPS receiver, and GPS position information (sampled once per second) may also be recorded.

Figure 5. Schematic diagram of the eye-tracking device.
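Because the time code is seeded from a GPS receiver while position fixes arrive only once per second and video frames arrive much faster, each frame's position must be interpolated between fixes during analysis. The sketch below shows one way this alignment might be done; the frame rate, coordinates, and function name are illustrative assumptions, not details of the actual recording unit.

```python
import numpy as np

def align_frames_to_gps(frame_times_s, gps_times_s, gps_lat, gps_lon):
    """Attach an interpolated GPS position to each video frame.

    frame_times_s : per-frame timestamps in seconds (from the time code,
                    whose epoch is set from the GPS receiver)
    gps_times_s   : 1 Hz GPS sample times in the same time base
    gps_lat/lon   : latitude/longitude at each GPS sample
    Returns per-frame (lat, lon) by linear interpolation between fixes.
    """
    lat = np.interp(frame_times_s, gps_times_s, gps_lat)
    lon = np.interp(frame_times_s, gps_times_s, gps_lon)
    return np.column_stack([lat, lon])

# Toy example: 30 Hz video aligned against 1 Hz position fixes
# (placeholder coordinates, not the actual flight route).
frame_times = np.arange(0.0, 5.0, 1.0 / 30.0)
gps_times = np.arange(0.0, 6.0, 1.0)
gps_lat = np.linspace(35.380, 35.382, gps_times.size)
gps_lon = np.linspace(-86.246, -86.244, gps_times.size)
positions = align_frames_to_gps(frame_times, gps_times, gps_lat, gps_lon)
print(positions.shape)   # (150, 2)
```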

Head Pose Estimation

While the head-mounted camera unit described above has the highest potential gaze-tracking accuracy, there may be situations in which it is impossible to use a head mount. Therefore, we are also exploring technologies to recover gaze using images from a remote, fixed camera. In such images, the subject's eye may subtend only a few pixels, in which case there is insufficient data to determine eye gaze. However, in normal behavior the eye rarely deviates more than 5 or 10 degrees from primary position (straight ahead in the head), so the pose (orientation) of the head can be used to obtain a crude estimate of gaze.

This is not the primary approach we envision for the SNI program at UTSI, but it is a suitable technology that may be useful either in the aircraft or in the simulator. We will know whether it is needed after preliminary testing concludes shortly.

Figure 6. Head pose estimation.

The problem of head-based gaze tracking can be divided into a few distinct components:

1. We must find the head in the image.

2. Once we have located the head, we must estimate its pose.

When color is available, it is often possible to find the face easily using a color-based segmentation (see Figure 6). For monochrome images we have also explored a template-matching algorithm.

To estimate the pose parameters, we have investigated methods for directly mapping the image to parameter values. To do this, we start with a training set of images with known pose values; a principal components analysis is then used to extract the eigenfaces, a set of images which captures the variation between the images in the training set. By restricting analysis to the first few eigenfaces, a large reduction in dimensionality is achieved. Finally, we solve for a polynomial function of the eigenface coefficients that comes closest to predicting the pose parameters in the training set. We have achieved accuracy of a degree or two using a synthetic training set. Work is currently in progress to develop a training set based on real images, where the pose parameters used for training are taken from a 3-D model of the test subject's head.

Head-Based Gaze Tracking

Once we have determined the gaze direction (be it of the eye or of the head), we still need to determine the location or object in the environment that is the target of gaze. To do this automatically, it is necessary to develop a model of the environment. Figure 7 shows an example taken from the interior of a control tower simulator, but the principles are the same for a cockpit or any other environment. We begin by constructing a model of the environment; in this case we used architectural drawings and on-site measurements. Next, we determine the position of the virtual camera for which the rendered model is aligned with the video data. Once that has been accomplished, we grab the image texture from the video and paste it onto the surfaces of the model. We can then re-render the model from novel viewpoints, including the subject's eye view. After the position of the subject's head has been determined, the estimated head pose can be used to cast a gaze vector; the intersection of this gaze vector with a model surface yields the point of regard in the scene.

Figure 7. Head-based gaze tracking.
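The Head Pose Estimation section above describes reducing each training image to a few eigenface coefficients and fitting a polynomial from those coefficients to the known pose parameters. The sketch below is a minimal numerical illustration of that idea using a squares-only quadratic feature set; the component count, feature choice, and function names are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def fit_eigenface_pose_model(train_images, train_poses, n_components=8):
    """Fit a pose-from-appearance model: PCA ("eigenfaces") for
    dimensionality reduction, then a least-squares polynomial mapping
    eigenface coefficients to pose parameters.

    train_images : (N, H*W) array of flattened, registered face images
    train_poses  : (N, P) array of known pose parameters (e.g., yaw, pitch)
    """
    mean = train_images.mean(axis=0)
    centered = train_images - mean
    # Eigenfaces = principal components of the training images.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = vt[:n_components]                       # (k, H*W)
    coeffs = centered @ eigenfaces.T                      # (N, k)
    # Quadratic features (squares only, for brevity) plus a constant term.
    feats = np.hstack([np.ones((coeffs.shape[0], 1)), coeffs, coeffs ** 2])
    weights, *_ = np.linalg.lstsq(feats, train_poses, rcond=None)
    return mean, eigenfaces, weights

def estimate_pose(image, mean, eigenfaces, weights):
    """Project a new image onto the eigenfaces, then evaluate the polynomial."""
    c = (image - mean) @ eigenfaces.T
    f = np.concatenate([[1.0], c, c ** 2])
    return f @ weights
```

In practice the training images would be the registered faces with known poses described above: synthetic renderings at first, and later real images whose pose parameters come from a 3-D model of the subject's head.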
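The Head-Based Gaze Tracking section above casts a gaze vector from the estimated head (or eye) pose and intersects it with surfaces of the environment model to find the point of regard. The sketch below illustrates that intersection step for an environment approximated by planar facets; the surface representation is an assumption for illustration, and a real implementation would also clip each intersection to the bounded extent of its facet.

```python
import numpy as np

def point_of_regard(head_pos, gaze_dir, surfaces):
    """Intersect a gaze ray with planar model surfaces and return the
    nearest intersection point (the point of regard).

    head_pos : (3,) head position in the environment model's frame
    gaze_dir : (3,) unit gaze direction derived from the head or eye pose
    surfaces : list of (point_on_plane, plane_normal) pairs, one per
               planar facet of the environment model
    """
    best_t, best_point = np.inf, None
    for p0, n in surfaces:
        denom = np.dot(gaze_dir, n)
        if abs(denom) < 1e-9:          # ray parallel to this surface
            continue
        t = np.dot(p0 - head_pos, n) / denom
        if 0 < t < best_t:             # nearest intersection in front of the head
            best_t, best_point = t, head_pos + t * gaze_dir
    return best_point

# Toy example: a wall 2 m ahead of the viewer, normal facing back.
wall = (np.array([0.0, 2.0, 0.0]), np.array([0.0, -1.0, 0.0]))
gaze = np.array([0.1, 1.0, 0.0])
gaze /= np.linalg.norm(gaze)
print(point_of_regard(np.zeros(3), gaze, [wall]))   # ~[0.2, 2.0, 0.0]
```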

The procedures for automatic determination of the camera position will also be used for video-based head tracking with the portable eye-tracker.

The first prototype device for the experiment has been designed and is currently undergoing testing at NASA Ames. On July 2nd the first outdoor test was made, and the need for some additional optical baffling was discovered. Construction of the second goggle is nearly complete. A different battery has been selected for the power source, and the power supply regulator issues have been solved. Most of the remaining outstanding issues will not be clearly defined until the July integration trip to UTSI.

SCHEDULE

Key schedule milestones are shown below:

Test Readiness Demonstration: 09-30-03
Data Collection Complete: 10-31-03
Simulator Complete: 12-31-03
Simulator Test Plan Draft: 03-01-04

REFERENCES

Cruz-Neira, C., Sandin, D., & DeFanti, T. (1993). Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE. Computer Graphics, 135-142.

Hickok, S. M., & McConkey, E. D. (2003). Flight Test Plan to Assess PVFR Routes and SNI Operations for Rotorcraft.

Lennerton, M. (2003). Exploring a Chromakeyed Augmented Virtual Environment as an Embedded Training System for Military Helicopters. Unpublished Master's thesis, Naval Postgraduate School, Monterey, CA.

Sullivan, J., Darken, R., & McLean, T. (1998, June 2-3). Terrain Navigation Training for Helicopter Pilots Using a Virtual Environment. Paper presented at the Third Annual Symposium on Situational Awareness in the Tactical Air Environment, Piney Point, MD.

Sullivan, J. A. (1998). Helicopter Terrain Navigation Training Using a Wide Field of View Desktop Virtual Environment. Unpublished Master's thesis, Naval Postgraduate School, Monterey, CA.

SUPPORTING DOCUMENTATION

Documents and technical direction related to this test include:

1. FAA Aeronautical Information Manual (AIM).
2. FAA Order 7110.65N, Air Traffic Control.