
Question: I would like to get your expert opinion on a dataset I just received. It is UAS-based imagery collected to produce 50cm digital elevation models (DEMs) and 5cm resolution true-color orthos. I do not have any other metadata related to the project. Is there any way to help me guess the horizontal and vertical accuracies of the generated products? Is there any ratio-based relationship between the horizontal and vertical accuracy of products generated from UAS?

Dr. Srini Dharmapuri, Michael Baker International, Pittsburgh (Beaver), PA

Dr. Abdullah: Your question is very important to the community of geospatial mapping, as it comes at a critical time when users like you are anxious and confused about the positional accuracy of products generated from an unmanned aerial system (UAS). Some UAS manufacturers are overselling their products without a thorough understanding and appreciation of the topic of data positional accuracy. Very often, I listen to technical presentations at conferences I attend and hear highly exaggerated claims about product accuracy. You even hear the terms subcentimeter or even millimeter absolute accuracy during some of these presentations.

I am not asserting that UAS-derived products cannot be produced with high accuracy, but I am saying that careful consideration needs to be taken when dealing with UAS-based sensors. The payload on board any small UAS, which forms the bulk of the UAS platforms utilized by the geospatial community, is characterized by miniaturized designs. Such reductions in the size and weight of the payload force a painful reality on the manufacturers of these small UASs, as they have to deal with miniaturized and quality-compromised imaging and auxiliary sensors. Most of the cameras offered in the market for UAS are consumer-grade and cost a few hundred dollars. Similarly, the GPS and inertial measurement unit (IMU) are characterized by degraded performance and accuracy. If users are not educated on this reality, they may believe the false accuracy claims made by some UAS manufacturers or even data providers.

Many precautions can minimize or even overcome the shortcomings of the sensors on board a small UAS: efficient flight planning that results in sufficient overlap between the imagery, RTK- or PPK-based GPS, a dense and accurate ground control network, and the right processing software, to mention a few. All of these precautionary measures taken during mission planning help assure high-quality, highly accurate products.

What makes this situation more challenging is the absence of legitimate and independent evaluation studies that users can trust to navigate their way when it comes to UAS-derived product accuracy. I am not aware of any governmental funding invested in the independent evaluation of products derived from small UAS. This bitter reality encourages me to share with you and other readers my recent experience along these lines. Woolpert, my employer, was one of the first geospatial and engineering companies to invest in UAS, and was the first surveying and aerial mapping company approved to fly a UAS commercially in designated airspace, earning an FAA Section 333 Exemption. Like other users, our clients questioned us about product accuracy. We too were in the dark about the accuracy of UAS-derived elevation data and orthos until last year, when we took the drastic measure of investing in an independent review of the accuracy of products derived from small UAS.
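Before moving to the case studies, here is a rough illustration of the flight-planning arithmetic behind the "sufficient overlap" precaution mentioned above. The numbers in the sketch (2.5cm GSD, a 6,000 x 4,000 pixel frame, 80 percent forward and 70 percent side overlap) are hypothetical and are not taken from either case study.

```python
# Rough flight-planning arithmetic: footprint and spacing from GSD and overlap.
# All input values are hypothetical illustration numbers, not from the case studies.

def footprint_and_spacing(gsd_m, img_width_px, img_height_px, forward_overlap, side_overlap):
    """Return ground footprint (m) and exposure/line spacing (m) for nadir imagery."""
    footprint_across = gsd_m * img_width_px                    # across-track coverage of one frame
    footprint_along = gsd_m * img_height_px                    # along-track coverage of one frame
    photo_spacing = footprint_along * (1.0 - forward_overlap)  # distance between exposures
    line_spacing = footprint_across * (1.0 - side_overlap)     # distance between flight lines
    return footprint_across, footprint_along, photo_spacing, line_spacing

# Example: 2.5 cm GSD, 6000 x 4000 frame, 80% forward / 70% side overlap
across, along, d_photo, d_line = footprint_and_spacing(0.025, 6000, 4000, 0.80, 0.70)
print(f"Footprint: {across:.0f} m x {along:.0f} m")                     # 150 m x 100 m
print(f"Photo spacing: {d_photo:.0f} m, line spacing: {d_line:.0f} m")  # 20 m, 45 m
```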
In the next sections, I will discuss two case studies we conducted to measure the accuracy of products derived from small UAS.

CASE I: Site Survey Analysis Using Small UAS

Although we operate a fleet of UAS, illustrated in Figure 1, we conducted this study using the Kespry system, which was flown over the 31-acre site surrounding our headquarters in Dayton, Ohio (Figure 2). This site was selected to represent a typical small survey job, such as a mall or a campus, and is ideal for small UAS operations. Case II, to be discussed later, is an ideal case study for a corridor mapping site.

Figure 1: Different types of UAS operated by Woolpert.
Figure 2: Site layout for Case I.

CASE I: The Imaging System

The payload on the Kespry quadcopter includes imaging and geolocation sensors. The imaging sensor is a Sony Alpha ILCE-5100 (α5100), equipped with a lens with a 16mm focal length. The sensor in the α5100 camera is an APS-C size Exmor CMOS image sensor (23.5 x 15.6mm). It has approximately 24.7 million total pixels and 24.3 million effective pixels (around 4,000 x 6,000 pixels). The configuration of the lens and the sensor results in a field of view (FOV) of 52 x 72 degrees.

CASE I: Flight Design

Six parallel flight lines were flown from an altitude of 350 feet AGL, resulting in an image ground resolution (GSD) of 2.7cm (see Figure 4).

Figure 4: Imagery layout for Case I.

CASE I: Ground Controls and Checkpoints Network

Our team of surveyors conducted two independent surveys to establish the network of ground control and checkpoints needed for the study. The team also surveyed profiles of the curbs, gutters, and sidewalks to assist in the accuracy analysis. Figure 3 illustrates the features surveyed in the two field surveys.

Figure 3: Ground controls and checkpoint network.

CASE I: Data Processing and Product Generation

The imagery was processed using Pix4D software. In addition to the imagery, the coarse GPS/IMU-derived exterior orientation parameters (easting, northing, elevation, omega, phi and kappa) and ground controls were imported into the software. Upon finalizing the aerial triangulation, referred to as optimization in Pix4D, two products were generated: orthorectified tiles with a GSD of 2.5cm and a digital surface model (DSM) with a post spacing of 5cm. These two products were used for the accuracy evaluation detailed in the coming sections.
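For readers who want to sanity-check the reported GSD, the short sketch below (a back-of-the-envelope calculation I added, assuming a nadir view, the nominal 16mm focal length, and the sensor dimensions quoted above) recomputes it from the 350-foot flying height.

```python
import math

# Back-of-the-envelope GSD check for Case I (Sony alpha-5100, 16 mm lens, 350 ft AGL).
FT_TO_M = 0.3048

sensor_width_mm = 23.5      # APS-C sensor width (long side)
image_width_px = 6000       # approximate pixel count along the long side
focal_length_mm = 16.0
altitude_m = 350 * FT_TO_M  # ~106.7 m above ground

pixel_pitch_mm = sensor_width_mm / image_width_px       # ~0.0039 mm per pixel
gsd_m = pixel_pitch_mm / focal_length_mm * altitude_m   # similar-triangles relation

print(f"Pixel pitch: {pixel_pitch_mm * 1000:.2f} um")   # ~3.9 um
print(f"Estimated GSD: {gsd_m * 100:.1f} cm")           # ~2.6 cm, consistent with the reported 2.7 cm
```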

CASE I: Testing Methodology, Number and Configuration of Ground Control Network

In order to evaluate the positional accuracy of the Kespry-derived products, different configurations of ground control number and distribution were planned and executed. Seven scenarios, A through G, were examined during the evaluation (see Figure 5). Scenario B, where no control points were used in the processing, is not shown in Figure 5.

Figure 5: Ground controls evaluation scenarios for Case I. (The blue triangles represent control points used in the processing.)

CASE I: Vertical Accuracy Evaluation

The vertical accuracy of the point clouds, a sample of which is illustrated in Figure 6, was assessed for each of the seven scenarios using TerraSolid's TerraScan software. An elevation value was derived from the point cloud for each of the checkpoints at the same location (easting and northing) derived from the orthorectified imagery. Discrepancies between the surveyed elevations and those derived from the point cloud were computed, from which the root mean square error (RMSE), the NSSDA accuracy figure at the 95 percent confidence level, and other statistics were computed and tabulated in Table 1. Table 1 and Figure 7 list a summary of the vertical accuracy statistics for each of the seven scenarios.

Figure 6: Colorized point clouds derived from the UAS-based imagery.

CASE I: Horizontal Accuracy Evaluation

The horizontal accuracy of the orthorectified imagery was assessed in ArcGIS for each scenario, A through G. Ortho tiles were imported into ArcGIS along with the shape file containing the checkpoints. Analysts modified the locations in the shape files to match each checkpoint to its location in the orthos. Once completed, the shape file was saved and labeled according to that scenario. Pix4D does not yet support the NAD83(2011) datum, so the processing may appear as if it was completed in NAD83(NSRS2007); in reality, all the products are in NAD83(2011). As both the ABGPS data and the ground controls were imported in their native NAD83(2011) and NAVD88 (12A) formats, Pix4D did not perform any internal conversion of the coordinate systems. Table 1 lists the summary of horizontal accuracy statistics for each of the seven scenarios.

Table 1: Horizontal and vertical accuracy from UAS products, Case I.

                                      Processing Scenario
Accuracy Term                        A      B      C      D      E      F      G
Number of Control Points            29      0      4      5      7      9     13
Number of Check Points              20     49     45     44     42     40     36
RMSE E (ft.)                      0.22   2.34   0.16   0.18   0.17   0.18   0.18
RMSE N (ft.)                      0.18   1.40   0.14   0.14   0.14   0.14   0.15
Radial RMSE N,E (ft.)             0.29   2.73   0.21   0.23   0.22   0.23   0.24
RMSE Elev. (ft.)                  0.32   1.62   1.35   0.32   0.23   0.25   0.29
Horizontal Accuracy at 95% (ft.)  0.49   4.72   0.36   0.40   0.39   0.39   0.41
Vertical Accuracy at 95% (ft.)    0.62   3.17   2.65   0.63   0.45   0.49   0.57

Figure 7: Horizontal and vertical accuracy from UAS products, Case I.
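The 95-percent figures in Table 1 are consistent with the standard NSSDA conversion factors (1.96 for vertical, 1.7308 applied to the radial RMSE for horizontal when the easting and northing errors are roughly equal). The short sketch below, my own illustration rather than code used in the study, reproduces the Scenario A numbers.

```python
import math

# NSSDA 95%-confidence accuracy from RMSE values (illustration using Table 1, Scenario A).
rmse_e, rmse_n, rmse_z = 0.22, 0.18, 0.32       # feet

rmse_r = math.hypot(rmse_e, rmse_n)             # radial (horizontal) RMSE
acc_horizontal_95 = 1.7308 * rmse_r             # NSSDA horizontal accuracy (assumes RMSE_E ~ RMSE_N)
acc_vertical_95 = 1.9600 * rmse_z               # NSSDA vertical accuracy

print(f"Radial RMSE:            {rmse_r:.2f} ft")            # ~0.28 ft (table lists 0.29)
print(f"Horizontal acc. at 95%: {acc_horizontal_95:.2f} ft")  # ~0.49 ft
print(f"Vertical acc. at 95%:   {acc_vertical_95:.2f} ft")    # ~0.63 ft (table lists 0.62)
```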

CASE II: Corridor Survey Using a Small UAS Surrogate System

Although we operate the fleet of UAS illustrated in Figure 1, we devised a system that mimics the operation of a UAS using a manned aircraft. Even with Part 107 of the new FAA regulations, we are still restricted from flying over people who are not participating in operating the UAS. To overcome this restriction, Woolpert manufactured a small pod to accommodate a payload that resembles the one on board a small UAS. We called it a UAS surrogate, and it was later officially given the name Renaissance. In order to evaluate UAS product accuracy over highways, we would have to deploy the Renaissance for the evaluation, since FAA rules do not restrict us from flying over busy highways with a manned aircraft as they do with a UAS. The pod of the Renaissance is mounted on the belly of a Cessna 182 (Figure 8). The flight was conducted over a 1.3-mile stretch of County Line Road in Dayton, Ohio.

Figure 8: Podded Renaissance system.

CASE II: The Imaging System

The payload on the Renaissance includes imaging and geolocation sensors. The imaging sensor is a NIKON D800E, equipped with a lens with an 85mm focal length. The sensor contains around 36 million pixels (7,360 x 4,912 pixels), with dimensions of 36 x 24mm. The configuration of the lens and the sensor results in a FOV of 23.85 x 16 degrees.

CASE II: Flight Design

Five parallel flight lines in the north-south direction were flown from an altitude of 1,100 feet AGL, resulting in an image ground resolution (GSD) of 2.0cm (see Figure 9). The three additional short east-west lines were flown only to cover the Woolpert headquarters.

Figure 9: Flight layout and ground controls/checkpoints network for Case II (purple plus sign = photo center, green triangle = control/checkpoint).

CASE II: Ground Control and Checkpoints Network

Our team of surveyors established the network of ground control and checkpoints needed for the study. A total of 38 well-defined points were surveyed to an accuracy of RMSEx,y,z = 0.1 feet. Figure 9 illustrates the ground control/checkpoints surveyed for this evaluation.

CASE II: Data Processing and Product Generation

The imagery was processed following the same procedure and processing software used for Case I.
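As a quick cross-check of the stated field of view and GSD, the following sketch (a rough nadir-geometry estimate I added, not code from the project) recomputes both from the D800E sensor dimensions, the 85mm lens, and the 1,100-foot flying height.

```python
import math

# Cross-check of Case II camera geometry (Nikon D800E, 85 mm lens, 1,100 ft AGL).
FT_TO_M = 0.3048

sensor_w_mm, sensor_h_mm = 36.0, 24.0
image_w_px = 7360
focal_mm = 85.0
altitude_m = 1100 * FT_TO_M                        # ~335 m above ground

# Field of view from the pinhole-camera relation: FOV = 2 * atan(d / 2f)
fov_w = math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_mm)))
fov_h = math.degrees(2 * math.atan(sensor_h_mm / (2 * focal_mm)))

# Ground footprint and GSD by similar triangles
footprint_w = altitude_m * sensor_w_mm / focal_mm  # ~142 m across track
gsd_m = footprint_w / image_w_px

print(f"FOV: {fov_w:.1f} x {fov_h:.1f} degrees")   # ~23.9 x 16.1, matching the reported 23.85 x 16
print(f"Footprint width: {footprint_w:.0f} m, GSD: {gsd_m * 100:.1f} cm")  # ~142 m, ~1.9 cm (reported 2.0 cm)
```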

CASE II: Testing Methodology, Number and Configuration of Ground Control Network

Seven scenarios, A through G, were examined during the evaluation (see Figure 10). Scenario A, where no control points were used in the processing, is not shown in Figure 10.

Figure 10: Ground control evaluation scenarios for Case II. (Blue triangles represent control points used in the processing.)

CASE II: Horizontal Accuracy Evaluation

The horizontal accuracy of the orthorectified imagery was assessed in a similar fashion to the method used in Case I. Table 2 lists the summary of the horizontal accuracy statistics for each of the seven scenarios.

CASE II: Vertical Accuracy Evaluation

The vertical accuracy of the point clouds was assessed in a similar fashion to the method used in Case I. Table 2 lists a summary of the vertical accuracy statistics for each of the seven scenarios.

Table 2: Horizontal and vertical accuracy of Renaissance products, Case II.

                                      Processing Scenario
Accuracy Term                        A      B      C      D      E      F      G
Number of Control Points             0      4      6      8     10     21     38
Number of Check Points              38     34     32     30     28     17      0
RMSE E (ft.)                      4.47   0.23   0.16   0.18   0.13   0.05   0.05
RMSE N (ft.)                      1.89   0.26   0.20   0.14   0.14   0.07   0.05
Radial RMSE N,E (ft.)             4.86   0.35   0.26   0.23   0.19   0.08   0.06
RMSE Elev. (ft.)                 13.51   0.54   0.71   0.40   0.35   0.26   0.17
Horizontal Accuracy at 95% (ft.)  8.40   0.60   0.45   0.39   0.34   0.14   0.11
Vertical Accuracy at 95% (ft.)   26.49   1.05   1.40   0.78   0.69   0.52   0.34

Results Analysis

UAS Results

From Table 1, Scenario B, it is obvious that for a small, square- or rectangular-shaped project similar to the one discussed in Case I, one can obtain submeter horizontal and vertical accuracy from UAS-derived products without any ground control points used in the processing, i.e. airborne GPS only. However, with four ground control points, one at each corner of the block, the horizontal accuracy stabilizes at under 0.20 feet. Additional ground control points beyond the four corner points do not seem to benefit the horizontal accuracy of the block (compare Scenarios C through G in the table). The story is a little different for the vertical accuracy, as the four corner points did not result in the desired vertical accuracy. A reasonable vertical root mean square error (RMSEv) was only reached after adding a fifth ground control point near the center of the block. Adding more ground control beyond the five points did not improve the vertical or the horizontal accuracy of the block (see Table 1 and Figure 7).

Renaissance Results

From Table 2, Scenario A, it is obvious that for corridor-type projects similar to the one discussed in Case II, we can obtain 5-foot horizontal accuracy and 14-foot vertical accuracy for products derived from the UAS-surrogate system flown from a manned aircraft at an altitude of 1,100 feet AGL without any ground control points used in the processing, i.e. airborne GPS only. The coarse vertical accuracy in Scenario A can be attributed to the combination of the uncalibrated focal length of the lens on the camera and the rough vertical accuracy of the airborne GPS used to process the imagery. However, with four ground control points, two at each end of the 1.3-mile corridor, the horizontal and vertical accuracies came down to the submeter level. Additional ground control points along the corridor helped bring the horizontal accuracy to under 0.20 feet and the vertical accuracy to under 0.30 feet. To assure a vertical accuracy of RMSE = 0.25 feet or better, it is recommended to have a pair of ground controls every 500 to 700 feet along the corridor.
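Returning to the reader's original question about a ratio between horizontal and vertical accuracy, the sketch below (my own tabulation of the published values in Tables 1 and 2, not analysis from the study) computes the vertical-to-horizontal RMSE ratio per scenario and shows that it varies widely rather than following a fixed factor.

```python
# Vertical-to-horizontal RMSE ratio per scenario, tabulated from Tables 1 and 2 above.
# This is my own illustration of the "is there a ratio?" question, not output from the study.

table1 = {  # Case I: (radial RMSE N,E in ft, RMSE Elev. in ft)
    "A": (0.29, 0.32), "B": (2.73, 1.62), "C": (0.21, 1.35), "D": (0.23, 0.32),
    "E": (0.22, 0.23), "F": (0.23, 0.25), "G": (0.24, 0.29),
}
table2 = {  # Case II: (radial RMSE N,E in ft, RMSE Elev. in ft)
    "A": (4.86, 13.51), "B": (0.35, 0.54), "C": (0.26, 0.71), "D": (0.23, 0.40),
    "E": (0.19, 0.35), "F": (0.08, 0.26), "G": (0.06, 0.17),
}

for name, table in (("Case I", table1), ("Case II", table2)):
    ratios = {scenario: round(v / h, 2) for scenario, (h, v) in table.items()}
    print(name, ratios)
# The ratio ranges from roughly 0.6 to 6.4 in Case I and 1.5 to 3.3 in Case II,
# i.e. no single fixed vertical-to-horizontal relationship emerges.
```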

Conclusions and Recommendations

1. The controlled experiments discussed in Cases I and II clearly show that one can obtain vertical and horizontal accuracies of 0.25 feet or better from UAS-derived products with proper mission planning and ground control design. Better results may be possible if the flying altitude is lowered and a different control network is used.

2. The latest actual corridor results using the Renaissance system exceeded the ground control spacing recommended in the Case II conclusions above. On a recent corridor construction project, where we fly over our client's project every three months, our client's team of surveyors is finding that we are meeting a 0.20-foot vertical accuracy from 3cm imagery flown from an altitude of 1,750 feet AGL. The average spacing between pairs of ground control along the 19-mile corridor was around 3,000 feet. This is a real testament to the accuracy of the products derived from a UAS surrogate, as the surveyors performed extensive field checks multiple times using the latest field surveying techniques.

3. One needs to be aware of the accuracy requirements for the ground control and checkpoints used in the photogrammetric workflow to produce orthorectified imagery and digital elevation models from UAS-based imagery. The new ASPRS mapping standard, the ASPRS Positional Accuracy Standards for Digital Geospatial Data, calls for the accuracy of the ground control points used to produce any products through the photogrammetric process to always be twice as good as the accuracy expected for the generated products. Therefore, the ground control used in the aerial triangulation process should be two times more accurate than the expected accuracy of the aerial triangulation. By the same token, the aerial triangulation should be two times more accurate than the orthos and/or the digital elevation model produced from the triangulated imagery. In other words, according to the ASPRS Positional Accuracy Standards for Digital Geospatial Data, ground control points used for aerial triangulation should have higher accuracy than the expected accuracy of the derived products, according to the following two categories:
   - Accuracy of ground control designed for planimetric data (orthoimagery and/or digital planimetric map) production only: RMSEx or RMSEy = 1/4 x RMSEx(Map) or RMSEy(Map); RMSEz = 1/2 x RMSEx(Map) or RMSEy(Map)
   - Accuracy of ground control designed for elevation data, or for planimetric and elevation data production: RMSEx, RMSEy or RMSEz = 1/4 x RMSEx(Map), RMSEy(Map) or RMSEz(DEM)

4. It is important to understand the above accuracy requirements to challenge people who claim that they can meet subcentimeter accuracy from UAS. In order for them to meet a 1cm vertical accuracy, their ground control should be surveyed to an accuracy of 0.25cm or better, according to the ASPRS Positional Accuracy Standards for Digital Geospatial Data (a numeric illustration follows this list). Such tight accuracy is hard if not impossible to meet using current GPS-based surveying techniques.

5. All results discussed in Cases I and II are based on standard consumer-grade GPS with an accuracy of 1.0 to 2.0 meters. Using RTK-based GPS for UAS operations will definitely result in an improvement in the accuracy of the derived products.
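To put numbers on the control-accuracy requirement in items 3 and 4, the short sketch below (my own calculator for the relationships quoted above, not part of the ASPRS standard or any project software) derives the required ground-control accuracy from a target product accuracy.

```python
# Required ground-control accuracy per the ASPRS relationships quoted above.
# Simple illustration of the 1/4 and 1/2 factors; units follow whatever the input uses.

def control_accuracy_for_planimetric(rmse_map_xy):
    """Control for orthoimagery / planimetric products only."""
    return {"RMSE_x_or_y": 0.25 * rmse_map_xy,   # 1/4 of the map's horizontal RMSE
            "RMSE_z": 0.50 * rmse_map_xy}        # 1/2 of the map's horizontal RMSE

def control_accuracy_for_elevation(rmse_product):
    """Control for elevation data, or combined planimetric and elevation production."""
    return {"RMSE_x_y_z": 0.25 * rmse_product}   # 1/4 of the product RMSE (map or DEM)

# The subcentimeter claim from item 4: a 1 cm vertical product accuracy
# implies ground control surveyed to 0.25 cm or better.
print(control_accuracy_for_elevation(1.0))       # {'RMSE_x_y_z': 0.25}  (cm)
```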
Finally, going back to your question: as you can see from my case studies, it is difficult to predict any accuracy figures for your products without knowing the operational details surrounding the mission circumstances that may affect the accuracy of the derived products. As for your question on whether a ratio-based relationship exists between horizontal and vertical accuracy, I did not notice any, other than that the horizontal accuracy stabilizes with fewer ground control points than the number needed to bring the vertical accuracy to a reasonable range. I hope the examples and the recommendations I have provided here will offer some guidance for you and other readers when dealing with UAS-derived products in the future.

Dr. Abdullah is Senior Geospatial Scientist and Associate at Woolpert, Inc. He is an ASPRS Fellow and the 2010 recipient of the ASPRS Photogrammetric (Fairchild) Award.

The contents of this column reflect the views of the author, who is responsible for the facts and accuracy of the data presented herein. The contents do not necessarily reflect the official views or policies of the American Society for Photogrammetry and Remote Sensing and/or Woolpert, Inc.