
Available online at www.sciencedirect.com

ScienceDirect

Procedia Manufacturing 3 (2015) 952-959

6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences, AHFE 2015

Assessment of alternative manual control methods for small unmanned aerial vehicles

Jonathan D. Stevenson*, Siu O'Young, Luc Rolland

Memorial University, St. John's, NL, A1C 5S7, Canada

Abstract

This paper summarizes experiments to assess alternative methods of controlling a small (<25 kg) Unmanned Aerial Vehicle (UAV) in manual mode. While the majority of UAV airborne activity will be in autonomous mode (i.e. under autopilot control), this may not always be the case during takeoffs and landings, or if the autopilot fails. The requirement for a manual control mode in all flight stages remains in the proposed UAV regulations currently being defined in both Canada and the U.S. The question then becomes: if we have to take manual control of the UAV, what is the best method? The research summarized in this paper is an attempt to answer this question.

© 2015 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of AHFE Conference.

Keywords: UAV; Unmanned aerial vehicle; Virtual reality; First person view; Manual control

* Corresponding author. Tel.: +0-514-465-0777. E-mail address: jstevenson@mun.ca

Under Transport Canada rules, small UAVs are classified as under 25 kg [1].

doi:10.1016/j.promfg.2015.07.144

1. Introduction

Current UAVs are controlled in either autonomous or manual control modes, depending on the type of UAV and the mission phase. The most common control modes are:

Autonomous Control:
1. Autopilot control, usually using GPS waypoints to define a flight plan;
2. Inertial, airspeed and pressure sensors used for inner-loop airframe control; and,
3. Automatic Take-off and Landing (ATOL) capabilities offered by some autopilots.

Manual Control:
1. Radio Control (RC) aircraft methods by an External Pilot (EP) using an un-aided third-person remote view of the UAV - very common with small UAVs;
2. A flight console, similar to a cockpit, using a fixed forward camera view to allow an EP to fly the UAV as in a simulator - very common with military UAVs like the Predator; and,
3. Virtual Reality (VR) methods employing various forms of First Person View (FPV) flying, including head-tracking techniques - increasingly popular in the RC hobby.

While some autopilots provide an ATOL capability, manual control is still more common during landing and take-off, including with the larger military UAVs [2]. This is in spite of the fact that the majority of UAV accidents occur during take-off and landing, especially with UAVs which rely on a manual pilot to accomplish these tasks [3]. Even if a fully autonomous capability (i.e. including ATOL) were provided, there would still be a requirement for a manual External Pilot (EP) according to current and proposed UAV regulations. Under current Canadian regulations, the EP is expected to be ready to assume manual control as needed, especially if there is a failure of the autopilot [4]. Similar regulations are expected in both the U.S. and Europe.
Nomenclature

AP    Autopilot
ATOL  Automatic Takeoff and Landing
AVO   Air Vehicle Operator
BLOS  Beyond Line of Sight
DOE   Design of Experiments
EP    External Pilot, also Manual Pilot (MP)
FPV   First Person View
FS    Flight Simulator
HUD   Heads-Up Display
LCD   Liquid Crystal Display
OSD   On-Screen Display (another name for a HUD)
RC    Radio Control
UAV   Unmanned Aerial Vehicle
VR    Virtual Reality

2. The hypothesis

It is proposed that switching to an FPV, akin to being inside a virtual cockpit, should improve the controllability of small UAVs, especially in adverse weather conditions such as high winds. The use of such Virtual Reality (VR) techniques has been investigated before [5] and is a common technique in the larger military UAVs such as the Predator-B. Switching to FPV methods should eliminate most of the disorientation problems associated with RC control [6]. This may also reduce the training/personnel problem which exists for many small UAVs that require a highly skilled RC pilot to fly them.

3. Landing experiments to assess manual control accuracy

A series of experiments was conducted to assess the accuracy of alternative methods of manual control. The experiment was conducted twice. The first experiment was conducted in August 2010 using equipment based on the X11 FPV set from RC-Tech [7]. The results were inconclusive, as presented at UVS Canada 2010 [8]. It was

decided that the experiment would be repeated at a later date with an improved test site and better equipment. This paper will focus on the results of the updated experiment conducted in November 2013.

3.1. Landing experiment design

The experiment focused on the task of landing a small UAV, as illustrated in Figure 1. The primary task is to land the UAV on the centerline of the runway, preferably just past the near end of the runway to allow a safe post-landing deceleration. The post-landing track should be straight down the center of the runway, to avoid violent lateral accelerations or sideways runway excursions.

The experiment used a Design of Experiments (DOE) factorial approach. This allows the use of a sparse matrix of test factor combinations while preserving the statistical validity of the results. This is useful when conducting real-world tests where running a full statistical set of all possible combinations would be too expensive, dangerous or impractical. The objective of the DOE method is to determine the factors which are most significant to the response variables being studied. By using a randomized order of test runs, the effect of random factors beyond the researchers' control is minimized and appears as noise in the statistical analysis [9].

3.2. Independent variables

Three factors were varied in this experiment, as detailed in the following sections.

3.2.1. Factor A - Form of manual piloting

Three different forms of manual control were assessed:

Radio Control (RC) Mode - The UAV is flown using an un-aided external third-person view of the aircraft.

Flight Simulation (FS) Mode - A fixed forward camera view along the aircraft centerline, providing a First-Person View (FPV) similar to a Flight Simulator cockpit display.
Immersive (VR) Mode - An immersive view using VR goggles, projecting an FPV binocular video image onto a pair of small LCDs directly in front of the pilot's eyes. The goggles also featured 2-axis tilt sensors inside the forehead of the goggle housing, providing a head-tracking ability. This allowed the VR pilot to turn his head and pan/tilt the camera on the aircraft, giving the illusion of being in the cockpit of the aircraft.

3.2.2. Factor B - Skill level of pilot

Two pilots of very different experience levels were recruited, to assess the impact that novel manual control methods might have on veteran or novice flyers. For the 2013 experiment they were:

Stephen Crewe (Veteran Pilot) - 10 years' experience flying RC aircraft, including one year using FPV equipment in his own personal RC flying.

Dilhan Balage (Rookie Pilot) - about 1 year of experience, most of it on the smaller Ultra40 and electric foamy type aircraft.

Fig. 1. Diagram of landing task for VR experiment.
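The factorial structure described above (three piloting modes, two pilot skill levels, and two wind levels) lends itself to a randomized full-factorial run order of the kind the DOE method calls for. A minimal sketch in Python, with factor labels assumed from the text rather than taken from the authors' actual test matrix:

```python
import itertools
import random

# Factor levels as described in Section 3.2 (labels are illustrative).
modes = ["RC", "FS", "VR"]        # Factor A: form of manual piloting
pilots = ["veteran", "rookie"]    # Factor B: skill level of pilot
winds = ["calm", "windy"]         # Factor C: wind conditions

# Full factorial: every combination of the three factors (3 x 2 x 2 = 12 runs).
runs = list(itertools.product(modes, pilots, winds))

# Randomize the run order so uncontrolled factors appear as noise in the
# statistical analysis rather than biasing any one treatment combination.
random.seed(42)  # fixed seed only so this sketch is repeatable
random.shuffle(runs)

for i, (mode, pilot, wind) in enumerate(runs, start=1):
    print(f"Run {i:2d}: mode={mode}, pilot={pilot}, wind={wind}")
```

In practice a fractional (sparse) subset of these combinations could be flown instead, as the paper notes, but the randomization step is the same.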

3.2.3. Factor C - Wind conditions

To assess the effect of weather, especially wind, two different wind levels were used, defined as follows:

Calm (<10 km/hr)
Windy (>30 km/hr)

3.3. Response variables

To assess the quality of each landing, the following response variables were used:

X = Touchdown point along the length of the runway.
Y = Maximum deviation from the runway centerline. During the experiment it was noted that this generally occurred at the touchdown point, as both pilots quickly corrected to the centerline once landed.

3.4. Test setup

The 2013 experiment used FPV equipment based on the Eagle Tree series of products. Aimed at the high-end RC hobby community, these FPV components are now borderline UAV avionics sets, including built-in head tracking, On-Screen Displays (OSD), and a rudimentary autopilot capability [10]. For these experiments only the head-tracking and OSD features were used.

3.4.1. Airborne equipment

The airborne FPV equipment is based on a two-axis turret with an integrated camera of NTSC resolution. Tilt, GPS and electrical propulsion system sensors are used by the OSD module to generate a combined video signal which includes display elements similar to a Heads-Up Display (HUD). The video signal is transmitted to the ground receive station using a 5.8 GHz transmitter of 200 mW power. The camera turret and electronics were integrated into two small boxes and mounted on top of the main wing of the flight vehicle, a Great Planes GiantStik [11], tail number 10 (i.e. GBS#10), as shown in Figure 2(a). A close-up of the FPV payload may be seen in Figure 2(b). Electric propulsion was used to reduce vibration and noise levels.

Fig. 2. (a) GiantStik test vehicle; (b) Close-up of FPV components.

Fig. 3. (a) VR mode goggles; (b) FS mode analog LCD display.

3.4.2. Ground components

1. VR Mode Goggles - The experiment used a set of VR goggles from Fat Shark [12]. These goggles feature a built-in 5.8 GHz receiver and tilt sensors for 2-axis head tracking. Output from the tilt sensors is fed back through the training port of the RC transmitter, as shown in Figure 3(a), and drives the pair of small servos in the camera turret using two unused auxiliary channels of the RC receiver.

2. FlightSim (FS) Mode Display - The 2013 experiment used a dedicated analog LCD for the FS mode. This was mounted on a tripod, as shown in Figure 3(b). The use of a dedicated FPV display unit improved the ergonomics of the FS mode versus the method used in 2010. The analog display also improved safety, as it continued to show degraded video even when the signal weakened, rather than a blank blue screen as with most digital LCDs. Note that the VR goggle displays also behaved in this manner.

3. HUD Display - The FPV video image included a superimposed HUD. Figure 4 is an example of the display provided to the pilots. The HUD provided the current aircraft position, ground speed (MPH) and altitude above ground (ft) in the form of ladders on the left and right sides respectively, as well as an artificial horizon. The items in the lower left and right corners provided the current power draw and total power consumed by the electrical propulsion system, thus serving as a fuel gauge during each flight.

Fig. 4. Sample FPV display with HUD.
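The head-tracking path described above (goggle tilt sensors fed through the transmitter's training port onto two spare servo channels) amounts to mapping a head angle onto a standard RC servo pulse width. A minimal sketch of that mapping, with the angle range and pulse limits assumed for illustration rather than taken from the Fat Shark or Eagle Tree specifications:

```python
def angle_to_pulse_us(angle_deg, angle_range_deg=90.0,
                      min_us=1000, max_us=2000):
    """Map a head-tracker tilt angle to an RC servo pulse width.

    angle_deg is the tilt relative to center; +/-(angle_range_deg/2)
    maps onto the conventional 1000-2000 microsecond servo range.
    (The 90-degree range here is an illustrative assumption.)
    """
    half = angle_range_deg / 2.0
    # Clamp so extreme head movement cannot over-drive the turret servos.
    angle = max(-half, min(half, angle_deg))
    frac = (angle + half) / angle_range_deg   # 0.0 .. 1.0 across the range
    return round(min_us + frac * (max_us - min_us))

# Pan and tilt would each be sent on one unused auxiliary channel.
print(angle_to_pulse_us(0.0))    # centered head -> mid-travel (1500 us)
print(angle_to_pulse_us(45.0))   # full deflection -> 2000 us
print(angle_to_pulse_us(-60.0))  # past the limit, clamped -> 1000 us
```

The clamp is the safety-relevant detail: without it, a pilot looking over his shoulder could command the turret servos beyond their mechanical stops.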

4. Experiment results

The experiment was run on Friday, November 29, 2013. The winds were down the runway (i.e. almost no crosswind) at 5-7 knots (about 9-13 km/hr). Conditions were sunny with a few clouds but quite cold (~2 °C). The test cases run in the morning were deemed to be Calm Day conditions.

The results showed a marked improvement in overall pilot performance versus the 2010 results. The effect of running the experiment on a proper runway, with aviation-grade markings, was clear. Both pilots reported it was very easy to see the runway threshold, centerline and numbers from a long distance, allowing for improved line-ups and precision landings. Similar to the 2010 results, the 2013 results show a gradual improvement as the tests were conducted, indicating a training factor may be involved.

The one disappointment was in the VR goggle cases flown by the rookie pilot. He expressed some discomfort with the VR mode. This led to loss of orientation on several occasions, resulting in some erratic flying and difficulty in performing a safe approach. In both VR landing attempts, the situation deteriorated to the point where the safety of the ground crew was a concern, and the other pilot had to take control.

Following the morning tests, and facing deteriorating team morale and weather, it was decided to perform a series of demonstration runs, with the author stepping in as the rookie pilot. Only four cases were run before weather conditions forced a stop to the experiment. Both pilots expressed optimism about the quality of the video feed, and also the ease of flying well-coordinated approaches. The VR method was not disorienting for either of us, and the accuracy of the approaches, especially in terms of centerline deviation, was clear.
One of my approaches went quite long, similar to the floating behavior of full-sized aircraft, but I was able to stick the landing essentially on the centerline of the runway. Weather conditions were rapidly deteriorating at this point, as it started to snow, forcing a stop to the experiment. This would turn out to be the final flying of Project Raven.

Fig. 5. Landing positions, 2013 VR experiment.

5. Analysis of the results

5.1.1. Landing accuracy

The results of the 2013 VR experiment are summarized in Figure 5. When compared with the results of 2010, several observations can be made. First, the accuracy of the landings has improved. The centerline deviations, apart from two of the FS mode cases, are quite good. Landing distance is very good too, but generally long when the FS mode is used.

The accuracy of the VR mode landings by the veteran pilot and the author was very good. The ease of lining up a good landing was greatly increased when the immersive 3D mode was used. But the failed approaches of the rookie pilot in VR mode were a disappointment. In post-test discussions it became evident that while the VR mode assisted the author and the veteran pilot, both of whom have flown flight simulators extensively since adolescence, it does not appear to provide the same advantage to a person untrained in manned aircraft flight. The rookie pilot first learned to fly in RC mode only, and this appears to be the method he is most comfortable with. This was a result those of us involved in aviation had not considered. Learning to fly RC aircraft normally requires combating the third-person disorientation effects caused by control reversal. The FPV mode is one way of eliminating this problem, but only for someone already familiar with manned aviation.

5.1.2. Flying accuracy and quality

Following the 2013 experiment, the video footage from both series of experiments was re-examined. The measurements of landing distance (X) and centerline deviation (Y) do not tell the complete story. It was noted that when flying in RC mode, the aircraft turns and altitude holding were not as accurate or smooth as with the FPV modes. When flying FPV, the pilot appears to naturally start to fly like a full-size aircraft. Turns are gentler, with bank angles more typical of manned aircraft.
Altitude, especially with the addition of HUD instruments, is held within 50 feet. The circuits are also much bigger in FPV mode, with the pilot flying downrange quite some distance before turning onto final approach. This last observation has significant repercussions, as elaborated in the next section.

5.1.3. Significance of the loss of GBS#11

A second GiantStik (GBS#11) was lost off-shore during initial FPV equipment shakedown tests at the abandoned U.S. naval air station at Argentia, NL. While not a planned event, this is an important result of these VR experiments. The aircraft was flown beyond the range of the FPV video transmitter, resulting in a loss of video signal, and was too far away to allow recovery using normal RC means. This illustrates one of the dangers of using FPV mode: pilots tend to fly much higher and further, possibly in an attempt to emulate full-size aircraft flying practice, to the point where FPV becomes the only viable means of manual remote control. The gigantic nature of the runways at Argentia (i.e. 200-foot-wide runways) only served to exacerbate this tendency.

Without a reliable Beyond Line-of-Sight (BLOS) manual control link, the use of RC methods alone at the extended ranges encouraged by the use of FPV may render the small UAV uncontrollable. It is for this reason that both the MAAC and AMA have set strict guidelines on the use of FPV techniques. In general, FPV can only be used within unassisted visual range, and always in the presence of a dedicated spotter who is also a second pilot, ready to assume normal RC control should the FPV pilot become disorientated or the equipment fail [13]. Transport Canada also cautions the UAV operator not to assume that current FPV technology alone can provide the desired situational awareness at BLOS ranges, in terms of providing the sense-and-avoid function of a manned aircraft pilot [14].
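The comparisons drawn in this section come down to the standard DOE step of estimating main effects: averaging the response at each level of a factor and comparing the level means. A minimal sketch of that computation on the centerline-deviation response, using placeholder numbers that are NOT the measured 2013 results:

```python
from collections import defaultdict

def main_effects(observations, factor_index):
    """Average the response for each level of one factor.

    observations: list of ((factorA, factorB, factorC), response) tuples.
    Differences between the returned level means indicate how strongly
    that factor drives the response.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for factors, response in observations:
        level = factors[factor_index]
        totals[level] += response
        counts[level] += 1
    return {level: totals[level] / counts[level] for level in totals}

# Placeholder centerline deviations (metres) -- illustrative only,
# not the measured data plotted in Figure 5.
observations = [
    (("RC", "veteran", "calm"), 4.0),
    (("RC", "rookie", "calm"), 6.0),
    (("FS", "veteran", "calm"), 3.0),
    (("FS", "rookie", "calm"), 5.0),
    (("VR", "veteran", "calm"), 1.0),
    (("VR", "rookie", "calm"), 3.0),
]

print(main_effects(observations, 0))  # mean deviation per piloting mode
print(main_effects(observations, 1))  # mean deviation per pilot
```

With a randomized run order (Section 3.1), uncontrolled influences average out across levels, so these per-level means are meaningful comparisons.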

6. Conclusions

When the results from both VR experiments are considered together, there are a number of important observations and conclusions that may be drawn:

1. The FPV modes improve landing accuracy, assuming the use of a runway with aviation-grade markings.
2. For someone entrenched in the use of RC modes, the addition of FPV may hinder landing accuracy.
3. The FPV modes appear to be promising, but only if used by pilots familiar with full-size aircraft flying methods, and with sufficient training.
4. The fixed-view FS mode, while easier to implement (i.e. not requiring the head-tracking equipment), appears to be less accurate than the immersive VR mode.
5. The immersive VR mode gives a wide field of view and improved situational awareness versus the FS mode.
6. The addition of a HUD improves the accuracy and quality of FPV flight modes.
7. When flying using FPV, pilots have a tendency to fly higher and further than in normal RC flying. It therefore becomes crucial that sufficient video signal strength (i.e. range) is provided to avoid sudden video drop-outs.
8. FPV methods are limited to electronic line of sight, namely the range of the video link.
9. Reliable manual control at BLOS ranges will require some form of augmented piloting console. One possibility is a hybrid display combining extended-range video with a synthetic environment similar to a flight simulator. This remains an active area of research and development within this research team.

Acknowledgements

This research was supported financially by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Atlantic Canada Opportunities Agency (ACOA).

References

[1] Transport Canada, Knowledge Requirements for Pilots of Unmanned Air Vehicle Systems (UAV) 25 kg or Less, Operating within Visual Line of Sight, Document TP15263, 2014.
[2] ASDNews, Predator B Demos Automatic Takeoff and Landing Capability, Aerospace and Defense News, Sept 18, 2012.
[3] K. Williams, A Summary of Unmanned Aircraft Accident/Incident Data: Human Factors Implications, Federal Aviation Administration, Oklahoma City, 2004.
[4] Transport Canada, Review and Processing of an Application for a Special Flight Operations Certificate for the Operation of an Unmanned Air Vehicle (UAV) System, Staff Instruction SI-623-001, November 19, 2014, Appendix I.
[5] G.S. Smith, G.J. Smith, A Virtual Reality Flight Trainer for the UAV Remote Pilot, Proceedings of the International Society for Optical Engineering, Vol. 4021 (2000), pp. 224-233.
[6] K. Williams, A Summary of Unmanned Aircraft Accident/Incident Data: Human Factors Implications, Federal Aviation Administration, Oklahoma City, 2004, p. 4.
[7] RC-Tech, X11 FPV Set, RC-Tech, Switzerland, 2010.
[8] J. Stevenson, Assessment of UAV Manual Pilot Techniques using a Design of Experiment Approach, UVS Canada 2010, Montreal, 2010.
[9] D.C. Montgomery, Design and Analysis of Experiments (5th ed.), Wiley & Sons, New York, 2001.
[10] Eagle Tree Systems, Guardian and OSD FPV, Eagle Tree Systems, LLC, Bellevue, WA, 2013.
[11] Great Planes, Giant Big Stik ARF Instruction Manual, Document GPMZ0197 for GPMA1224 V1.0, Great Planes Inc., Champaign, IL, USA, 2005.
[12] Fat Shark, Attitude V2 FPV Goggle with Trinity Head Tracking, User Manual Revision B, Fat Shark RC Vision Systems, 2013.
[13] MAAC, FPV Guidelines, Model Aeronautics Association of Canada (MAAC), March 2012.
[14] Transport Canada, Review and Processing of an Application for a Special Flight Operations Certificate for the Operation of an Unmanned Air Vehicle (UAV) System, Staff Instruction SI-623-001, November 19, 2014, Section 6.2 (9).