
INSTRUMENT DESIGN FOR THE PEGASUS HALE UAV PAYLOAD

T. Van Achteren, B. Delauré, J. Everaerts
Flemish Institute for Technological Research (VITO), Centre for Remote Sensing and Earth Observation (TAP)
Boeretang 200, B-2400 Mol, Belgium
Tel. +32 14 336854; Fax +32 14 322795
(tanja.vanachteren, bavo.delaure, jurgen.everaerts)@vito.be

KEY WORDS: Camera, High resolution, Performance, Design, Modelling, Simulation, Cartography, Photogrammetry

ABSTRACT:

In 2000, VITO initiated the Pegasus project to demonstrate the feasibility of remote sensing from a HALE UAV. Ultimately the HALE UAV platform will be equipped with a variety of lightweight high resolution remote sensing instruments, whose synergy will allow continuous observations irrespective of light or weather conditions. In a first phase a multi-spectral digital camera will be developed and flown on the solar-powered Mercator1 platform. The payload will also contain a GPS receiver and an Inertial Measurement Unit (IMU) for position and attitude determination of the camera at the instant of exposure. A direct downlink will allow near real-time data delivery to the user. The study described in this paper investigates the feasibility of the top-level requirements of the multi-spectral camera under the strict physical and environmental constraints of the Mercator1 platform flying at 18 km altitude.

1. INTRODUCTION

Satellites and manned aircraft have been used for remote sensing for many years. They have distinct advantages (e.g. global coverage and high update rates for satellites; high spatial resolution and precision combined with great flexibility for aircraft), but some drawbacks as well (e.g. imprecise georeferencing for satellite data; potential delays due to air traffic or weather conditions for aircraft). Driven by the technological evolution of the last ten years, a novel platform is entering the scene: the High Altitude Long Endurance Unmanned Aerial Vehicle (HALE UAV). It offers a combination of the advantages of airborne and spaceborne platforms while minimizing their drawbacks. Flying above air traffic at stratospheric altitudes, it is flexible enough to exploit any opportunity for data acquisition, for instance holes in the cloud cover. In response to the demands of the remote sensing market, the persistent availability of the HALE UAV makes it possible to produce high resolution images with regional coverage, a combination accessible neither from airborne platforms (local coverage) nor from spaceborne platforms (inferior resolution). A second important advantage is the excellent hovering capability, which opens new possibilities for continuous event monitoring (for instance during crisis situations) with update rates of less than 30 minutes; in effect, it realizes a regional geostationary platform.

To exploit the potential of the HALE UAV efficiently, the onboard instruments should be adapted to the specific environment and the unique operational regime of this innovative platform. The payload development therefore requires a re-evaluation of the design parameters of remote sensing instruments. The payload system design starts with a complete requirements analysis, covering the target application area, the platform on which the payload will be mounted and the environmental conditions. For the Mercator1 platform, weight and volume are stringent constraints, imposing a careful analysis of the freedom available in subcomponent selection and integration.
Optimization of the global payload system design is governed by trade-offs between the different requirements and subsystem performances. In this paper we introduce a global system performance model of a multi-spectral camera payload and its subcomponents, the parameters which define the performance, costs and constraints, and the global trade-offs involved in deciding on the final camera system. We further illustrate the design parameters and the corresponding trade-offs by comparing the HALE UAV instrument with an example airborne and spaceborne system: the Vexcel Ultracam D and IKONOS2. This study was part of the phase B preliminary design of an optical payload called MEDUSA, to be mounted on the Mercator1 platform.

2. DEVELOPMENT FRAMEWORK

2.1 Overview of the camera system

The central part of the MEDUSA payload is a combination of two frame sensors, one panchromatic and one with RGB filters. The payload consists of several subcomponents:
- Optics (lenses and/or mirrors)
- Focal plane assembly (FPA): sensors and front-end electronics
- GPS L1/L2 antenna and receiver, Inertial Measurement Unit (IMU)
- Command & Data Handling Unit (C&DHU)
- S-band (2 GHz) antenna and transmitter.

The presence of the GPS receiver and the IMU allows direct geo-referencing of the camera images. The on-board data processing consists of time-tagging, basic image corrections, and organizing and compressing the data. Processing and archiving are conducted on the ground, where data are received by the ground station and forwarded to a Central Data Processing Centre (CDPC) at VITO, Belgium. A schematic overview of the payload subcomponents is given in Figure 1.

Figure 1. MEDUSA camera system and its subsystems.

The MEDUSA camera subsystems are installed in a light-weight carbon fibre support frame which serves at the same time as housing. Compared to typical airborne high resolution cameras, the MEDUSA payload has very strict and challenging physical and environmental constraints:
- Total weight < 2 kg
- Power consumption < 50 W
- Max. aperture D = 10 cm
- Length of the payload < 1 m
- Pixel size < 5.5 µm
- Attitude variations of the UAV
- Thermal environment: -70 °C non-operational temperature and thermal cycling over the day (-40 to +30 °C)
- Low pressure: 60 mbar

Figure 2 shows a schematic layout of the payload housing, which will be mounted in front of the fuselage of the UAV. Aerodynamic fairings will be installed at the front and the back of the housing to minimize drag. The available volume for the payload is a cylinder with length L = 1000 mm and outer diameter 120 mm; the free inner diameter is 110 mm.

Figure 2. Schematic view of the payload housing and its content (folding mirror, lens groups, FPA, IMU, GPS, C&DHU, E-box tray, GPS antenna, transmitter).

2.2 Top-level user requirements and constraints

The top-level user requirements for the MEDUSA payload are:
- Ground resolution: 30 cm (@ 18 km) or less
- Wavelength range: 400-650 nm (RGB)
- Swath width: 3000 m (>= 10 000 pixels)
- SNR = 100 @ 8:00 am at equinox
- Frame sensor with electronic shuttering: 10000 x 1200 pixels
- 60% overlap between images for block bundle adjustment
- RF downlink within a range of 150 km from the ground station

2.3 Comparison with airborne and spaceborne cameras

In Table 1 we compare a number of technical specifications of the MEDUSA camera system flying onboard a UAV at 18 km altitude with an example of a high resolution airborne camera, the Vexcel Ultracam D (Leberl, 2005; Vexcel, 2006), and a high resolution spaceborne camera, Ikonos-2 (Eoportal, 2006). The MEDUSA camera is designed to fill the gap between traditional airborne and spaceborne instruments regarding resolution and coverage. It targets applications such as disaster management and cartography, which require high resolution images with regional coverage, flexible trajectories, high update rates and longer mission durations.

|                       | Vexcel Ultracam D                  | MEDUSA                                  | IKONOS2                     |
| Platform              | airborne                           | airborne (stratosphere)                 | spaceborne                  |
| Coverage              | local                              | regional                                | global                      |
| Camera type           | multi-frame array                  | frame array                             | pushbroom linear array      |
| Optics                | four-in-line lens system           | refractive lens system                  | Cassegrain reflective system |
| Sensor size           | 11000 x 7500 px (PAN), 4008 x 2672 px (MS) | 10000 x 1200 px (PAN), 10000 x 1200 px (MS) | 13500 px (PAN), 3375 px (MS) |
| GSD                   | 0.05 m @ 0.5 km                    | 0.3 m                                   | 1 m                         |
| Aperture              | 0.018 m (@ f/5.6)                  | <= 0.1 m                                | 0.7 m                       |
| Pixel size            | 9 µm                               | 5.5 µm                                  | 12 µm                       |
| Wavelength range      | RGB + NIR                          | 400-650 nm                              | 450-900 nm                  |
| Focal length          | 0.1 m                              | 0.33 m                                  | 10 m                        |
| Altitude              | 0.5 km                             | 18 km                                   | 681 km                      |
| Swath                 | 0.55 km @ 0.5 km                   | 3 km                                    | 13 km                       |
| Frame rate            | 1 frame/s                          | 0.7 frames/s                            | 6500 lines/s                |
| On-board compression  | on-board storage: TIFF, JPEG, Tiled TIFF | low-loss JPEG2000 (ratio 8.5)     | low-loss ADPCM (ratio 4.24) |
| Weight                | < 45 kg (sensor unit) + < 65 kg (control, data storage and processing unit) | 2 kg | 171 kg  |
| Power consumption     | 150 W (sensor unit) + 700 W (control, data storage and processing unit) | 50 W | 350 W      |
| Motion compensation   | stabilized, TDI controlled         | no stabilized mount, very small integration times | stabilized, TDI controlled |
| Thermal environment   | thermal environment @ 0.5 km       | -70 °C non-operational, -40 to +30 °C operational | thermal control   |
| Pressure              | air pressure @ 0.5 km              | 60 mbar                                 | vacuum                      |
Table 1. Comparison of the MEDUSA camera system with traditional high resolution airborne and spaceborne remote sensing cameras.

The challenge lies mainly in realizing the camera specifications within the extreme environmental and physical constraints. Compared to traditional airborne and spaceborne systems, the MEDUSA camera system is ultra-lightweight and has only a limited amount of power available for the on-board electronics. Moreover, it is operated in a low pressure and low temperature environment subject to thermal cycling, and it must compensate for platform attitude variations within the strict weight and power constraints.

3. SYSTEM-LEVEL TRADEOFFS

In this study, the feasibility of the MEDUSA top-level requirements under the given physical and environmental constraints has been investigated. The main parameters defining the camera system performance are:
- the modulation transfer function (MTF)
- the signal-to-noise ratio (SNR)
- the ground sampling distance (GSD)
- the frame rate and readout rate of the sensor
- the spectral range of the sensor.

To make system-level trade-offs in a fast and automatic way, a set of analysis tools has been developed as part of the Phase B preliminary design. We have built an image simulator in Matlab which evaluates the image degradation caused by the different subcomponents of the system on an arbitrary high resolution input image. Another tool calculates the signal-to-noise ratio of the system as a function of numerous parameters, such as the at-sensor radiance, GSD, optical transmission coefficient and quantum efficiency of the sensor. From this analysis more detailed subsystem requirements have been derived as input for the preliminary design of the subsystems. The reported subsystem performance can be fed back into the analysis tools to verify the compliance of the system with the top-level requirements.

3.1 Ground sampling distance, focal length and pixel size

To realize a ground sampling distance GSD from a given altitude h, the focal length f of the optical system and the pixel pitch p of the sensor must satisfy

    GSD / h = p / f        (1)

for a system focused at infinity. Figure 3 shows the focal length as a function of typical sensor pixel sizes, for three GSD values from 18 km altitude.

Figure 3. Focal length as a function of sensor pitch for three different ground sampling distances.

The strict volume requirements on the payload structure impose constraints on the focal length and the pixel size. The focal length has a direct impact on the effective length of the optical system, more dominantly in the case of a refractive lens system. As will be discussed in Section 3.3, a reflective optical system is not an option for the MEDUSA payload due to its lower optical performance. In addition, the pixel size affects the diagonal of the sensor's sensitive area and hence the diameter of the optical system, which has to fit within the 11 cm inner diameter of the payload housing. Since the focal length and the diameter of the optical system increase with pixel size, it is best to use the smallest possible pixel size for the sensor. Moreover, the focal length increases with decreasing ground sampling distance, so the maximum focal length also limits the smallest ground sampling distance achievable within the given volume and weight constraints. A pixel size of 5.5 µm results in a focal length of 330 mm at GSD = 30 cm and a sensor width of 5.5 cm; this is considered the maximum pixel size for the MEDUSA camera.

In addition, electronic snapshot shuttering is required, since a mechanical shutter is too heavy and its reliability is doubtful at the low temperatures encountered at high altitude. To our knowledge there is no wide swath imaging sensor on the market today with an electronic snapshot shutter and a pixel size of 5.5 µm or smaller. Therefore a feasibility study for a custom designed wide swath sensor has been executed by Cypress Semiconductor Corporation Belgium as part of the MEDUSA phase B preliminary design.
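As a quick sanity check of Eq. (1), the short Python sketch below reproduces the focal length, sensor width and swath quoted in this section from the altitude, GSD and pixel pitch; the numeric values are taken from the paper, but the script itself is only our illustration.

    # Eq. (1): GSD / h = p / f, for a system focused at infinity.
    h = 18_000.0        # flight altitude [m]
    gsd = 0.30          # target ground sampling distance [m]
    p = 5.5e-6          # pixel pitch [m]
    n_across = 10_000   # across-track pixels (swath requirement)

    f = p * h / gsd                # required focal length [m]
    sensor_width = n_across * p    # physical sensor width [m]
    swath = n_across * gsd         # ground swath [m]

    print(f"focal length : {f * 1e3:.0f} mm")             # -> 330 mm
    print(f"sensor width : {sensor_width * 1e2:.1f} cm")  # -> 5.5 cm
    print(f"swath        : {swath:.0f} m")                # -> 3000 m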
3.2 Modulation transfer function

3.2.1 Definition: The modulation transfer function (MTF) is the spatial frequency response of an imaging system or a component; it is the contrast at a given spatial frequency relative to the contrast at low frequencies. Spatial frequency is typically measured in cycles or line pairs per millimetre (lp/mm), which is analogous to cycles per second (hertz) in audio systems. High spatial frequencies correspond to fine image detail: the more extended the response, the finer the detail and the sharper the image. The Nyquist frequency of a sensor is the highest spatial frequency, in lp/mm, detectable by the sensor:

    f_N = 1 / (2 p)        (2)

where p is the pixel size in mm. Similar to e.g. the Quickbird instrument (Scott, 2004), the target system MTF for the MEDUSA payload is set at 10% at the sensor Nyquist frequency. Following Eq. (2) with a pixel size of 5.5 µm, the Nyquist frequency of the MEDUSA camera becomes 91 lp/mm.

3.2.2 System MTF contributions: Each of the subcomponents in the camera system and its environment contributes to image degradation and blurring. Possible contributors are turbulence in the atmosphere, the resolution of the optics (lenses and mirrors), the finite sampling of the image sensor, motion blur along the flight direction, lossy data compression and a non-zero bit error rate during transmission. The system MTF is obtained by applying the Fourier transform to a knife-edge image and multiplying the influences of the different subcomponents in the spatial frequency domain:

    MTF_sys = MTF_atmosphere * MTF_optics * MTF_sensor * MTF_motion * MTF_compression * MTF_transmission        (3)

Currently, theoretical MTF models for the major influences (atmosphere, optics, sensor and platform motion) have been taken into account. Further work will include the influence of noise electrons in the sensor, data compression and transmission effects, which are for now expected to have less impact than the optics and sensor MTF. Fried (1966) describes the MTF degradation due to turbulence in the Earth's atmosphere; sea-level visibility is assumed for remote sensing within Belgium. A very simple model for the optics MTF is taken from Koren (2001), while Jacobson (2006) gives the formula for pure diffraction, which is an upper limit for the optics MTF. The sensor MTF (ASPRS, 2004) depends on the resolution and the fill factor of the sensor. Finally, the motion MTF (ASPRS, 2004) describes the blurring of the image due to the motion and attitude variations of the platform during exposure. A maximum motion blur of 0.5 pixel has been taken into account, which results in an integration time of 0.550 ms.
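To illustrate how the multiplicative budget of Eq. (3) is evaluated, the Python sketch below cascades standard textbook component models (diffraction-limited optics for a circular aperture, detector sampling, linear motion smear) at the MEDUSA design point. These generic formulas stand in for the paper's actual models (Fried's atmospheric MTF and the real optics design are not reproduced here), so the numbers are indicative only.

    import numpy as np

    p_mm = 5.5e-3                   # pixel pitch [mm]
    f_nyq = 1.0 / (2.0 * p_mm)      # Eq. (2): Nyquist frequency, ~91 lp/mm
    nu = np.linspace(0.1, 180.0, 500)   # spatial frequency axis [lp/mm]

    # Diffraction-limited MTF of a circular aperture (upper limit for the optics).
    wavelength_mm = 550e-6              # mid-band wavelength, assumed
    f_number = 330.0 / 100.0            # focal length 330 mm / aperture 100 mm
    nu_cut = 1.0 / (wavelength_mm * f_number)   # diffraction cutoff [lp/mm]
    x = np.clip(nu / nu_cut, 0.0, 1.0)
    mtf_optics = (2.0 / np.pi) * (np.arccos(x) - x * np.sqrt(1.0 - x**2))

    # Detector sampling MTF for a 100% fill-factor pixel: |sinc(nu * p)|.
    mtf_sensor = np.abs(np.sinc(nu * p_mm))

    # Linear motion MTF for a smear of 0.5 pixel: |sinc(nu * 0.5 * p)|.
    mtf_motion = np.abs(np.sinc(nu * 0.5 * p_mm))

    mtf_sys = mtf_optics * mtf_sensor * mtf_motion   # atmosphere etc. omitted

    print(f"system MTF at Nyquist ({f_nyq:.0f} lp/mm): "
          f"{np.interp(f_nyq, nu, mtf_sys):.2f}")    # comfortably above 0.10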

These MTF models have been used to derive MTF requirements for the optics and sensor subsystems in order to obtain a system MTF of at least 10% at the sensor Nyquist frequency. The total system MTF is calculated by multiplying together the MTF curves of the different subcomponent influences, see Figure 4; the black solid curve corresponds to the total simulated system MTF.

Figure 4. Total simulated system MTF taking into account models for the main subcomponent influences (atmosphere, diffraction limit, optics, sensor, motion and total, plotted from 0 to 180 lp/mm).

3.2.3 Image simulator

We have built an image simulator in Matlab which evaluates the image degradation caused by the different subcomponents of the system on an arbitrary high resolution input image. Currently, theoretical MTF models for the major influences (atmosphere, optics, sensor and platform motion) have been implemented in the spatial frequency domain. The impact of parasitic light sensitivity (PLS) noise has been implemented in the spatial image domain. For a snapshot shutter device the PLS is of utmost importance and should be as small as possible: it is a key parameter indicating the sensitivity to light of the storage element in the pixel, compared to the photodiode sensitivity, during readout. The expected contrast reduction and sensor resolution of the MEDUSA camera are illustrated on a Vexcel Ultracam input image (Vexcel, 2006) at 10 cm resolution in Figure 5.

Figure 5. MTF contrast reduction in the image simulator: input image Vexcel Ultracam @ 10 cm GSD; simulated output image MEDUSA camera @ 30 cm GSD.
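The frequency-domain step of such a simulator can be sketched in a few lines of Python: transform the image, multiply by a system MTF, transform back. The Gaussian MTF used below is purely illustrative (shaped only so that it passes 10% at the Nyquist frequency of 0.5 cycles/pixel); the actual Matlab tool applies the physical component models of Section 3.2.2.

    import numpy as np

    def apply_mtf(image, mtf_at_nyquist=0.10):
        """Degrade an image by a (toy, isotropic Gaussian) system MTF."""
        ny, nx = image.shape
        fy = np.fft.fftfreq(ny)            # cycles/pixel, Nyquist = 0.5
        fx = np.fft.fftfreq(nx)
        rho = np.sqrt(fx[None, :]**2 + fy[:, None]**2)
        # Gaussian shaped so that MTF(0.5 cycles/pixel) = mtf_at_nyquist.
        sigma2 = -(0.5**2) / (2.0 * np.log(mtf_at_nyquist))
        mtf = np.exp(-rho**2 / (2.0 * sigma2))
        return np.real(np.fft.ifft2(np.fft.fft2(image) * mtf))

    # Toy usage with random data standing in for a real input image.
    blurred = apply_mtf(np.random.default_rng(1).random((256, 256)))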
3.3 Optical system trade-offs

Two concepts have been considered for the optical system. A reflective telescope solution consists of a folding mirror, a primary and a secondary mirror, and correcting lenses in front of the sensor. Advantages of a reflective system are:
- Smaller volume (the focal length is folded up)
- Typically lighter than a system with lenses
- Chromatic aberrations are less of an issue.

Disadvantages of the reflective solution are:
- Due to the large aberrations at a large field of view (FOV), the optical performance degrades considerably at the edges of the image.
- The optical resolution is degraded considerably by the obscuration of the secondary mirror, which reduces the effective aperture diameter and thus the diffraction limited MTF curve (Smith, 2000). The obscuration also reduces the light transmission, making it harder to achieve a good signal-to-noise ratio, as explained in Section 3.4.

A refractive solution consists of a folding mirror and three groups of lenses. Advantages of a refractive system are:
- A large FOV is less of a problem than for a reflective system.
- Much better MTF performance can be achieved.
- The light transmission is much higher, since there is no obscuration, so a higher SNR can be achieved.

Disadvantages of a refractive system are:
- Higher weight
- Chromatic aberrations have to be corrected.

In conclusion, since a reflective solution falls well short of the optical resolution targeted for the MEDUSA payload, a refractive solution has been selected. This inevitably results in a higher mass.

3.4 Signal-to-noise ratio

Four undesirable signal components (noise), which degrade the performance of a CMOS imaging device by lowering the signal-to-noise ratio, are considered in calculating the overall SNR: shot noise, dark noise, read noise and PLS noise. PLS noise is the statistical variation on the PLS offset added to the original signal during readout. PLS corrections in the C&DHU remove the PLS offset value for a large part; the PLS noise is what remains and is included in the SNR calculations. The PLS noise has been calculated for the Vexcel Ultracam picture used in the MTF simulations (Figure 5). Due to the considerable amount of motion blur during the readout time, the offset averages out quite homogeneously over the complete image and the PLS noise is very small.

The at-sensor radiance per channel has been calculated using MODTRAN 4. Our SNR calculations have been performed on the Kodak KAI-11000 CCD sensor; the custom CMOS sensor by Cypress is designed to have a similar spectral response and quantum efficiency. Figure 6 shows the SNR as a function of the integration time for different GSD values with a Bayer color filter, for the worst case in terms of light intensity (20° solar elevation, albedo 0.05) and an optical transmission coefficient of 60% at 100 mm aperture. To achieve an SNR of at least 100 for a GSD of 30 cm, an integration time of at least 2.6 ms is required.

Figure 6. SNR as a function of integration time and GSD for a color sensor (blue band) in the wavelength range 400-650 nm, at 20° solar elevation and albedo 0.05.

On the other hand, due to the forward motion of the platform and, more dominantly, the attitude variations of the platform during exposure, the maximum integration time for 0.5 pixel motion blur at a GSD of 30 cm is 0.550 ms. The motion blur increases linearly with the integration time, as illustrated in Figure 7.

Figure 7. Motion blur in pixels (2 sigma) versus integration time and GSD.

There are different ways to increase the SNR:
- Increasing the GSD increases the amount of power received by a pixel and thus increases the SNR.
- A panchromatic sensor receives about 4 times more light per pixel, which increases the SNR. The panchromatic sensor can be combined with a lower resolution color sensor for pan-sharpening (see Section 3.5).
- Allowing more motion blur, the integration time can be increased, but at the same time the system MTF is slightly reduced.
- Reducing the acquisition window during the day keeps observations above higher solar elevations.

Increasing the GSD is not preferred, since high resolution is needed for cartography, one of the applications the MEDUSA camera system is targeting.
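The shape of the SNR curves in Figures 6 and 8 follows from a standard imaging noise budget, sketched below in Python. The photon flux is a made-up placeholder, tuned so that the SNR reaches about 100 at 2.6 ms as quoted for the color sensor; the dark current and read noise figures are illustrative guesses, not KAI-11000 datasheet values.

    import numpy as np

    def snr(t_int_ms, signal_e_per_ms, dark_e_per_ms=0.05,
            read_noise_e=30.0, pls_noise_e=0.0):
        """SNR of one pixel after an integration time of t_int_ms [ms]."""
        signal = signal_e_per_ms * t_int_ms     # collected signal electrons
        shot_var = signal                       # shot noise variance = signal
        dark_var = dark_e_per_ms * t_int_ms     # dark-current shot noise
        noise = np.sqrt(shot_var + dark_var + read_noise_e**2 + pls_noise_e**2)
        return signal / noise

    flux = 4600.0              # signal electrons per ms (placeholder, not from the paper)
    blur_per_ms = 0.5 / 0.55   # ~0.91 px/ms (2 sigma), from the quoted 0.5 px at 0.550 ms
    for t in (0.55, 1.1, 2.6):
        print(f"Tint = {t:4.2f} ms -> SNR = {snr(t, flux):5.1f}, "
              f"motion blur ~ {blur_per_ms * t:.1f} px")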
Figure 8 shows the SNR for the Kodak KAI-11000 CCD sensor as a function of the integration time for different solar elevations for a panchromatic sensor. To achieve an SNR of at least 100 for a GSD of 30 cm at equinox, an integration time of at least 1.1 ms is required. Further, if we allow a motion blur of 1 pixel, an integration time of at most 1.1 ms is allowed; the system MTF decreases but remains above the 10% system MTF requirement.

Figure 8. SNR results for a panchromatic sensor (400-650 nm) at GSD = 30 cm and 1 pixel motion blur, for solar elevations of 20°, 35° and 50° and albedo values from 0.05 to 0.25, with and without PLS noise.

3.5 Camera concept and sensor configuration

As discussed in Section 3.4, the SNR > 100 requirement can only be met with a panchromatic sensor. To obtain a high resolution color image, the technique of pan-sharpening is considered (Zhang, 2004): a lower resolution color image is fused on-ground with a high resolution panchromatic image to generate a high resolution color image. Three possible concepts have been identified for the MEDUSA camera system, each placing both a panchromatic and a color sensitive sensor in the focal plane, as illustrated in Figure 9.

Figure 9. Three possible sensor configuration concepts for pan-sharpening (Concepts 1, 2 and 3).
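The paper does not specify the fusion algorithm; the sketch below uses the Brovey ratio transform, one common textbook pan-sharpening scheme, simply to make the data flow concrete: a color image resampled to the panchromatic grid is rescaled band by band so that its intensity matches the sharp panchromatic image.

    import numpy as np

    def brovey_pansharpen(rgb_lowres, pan):
        """rgb_lowres: (H, W, 3) color image already resampled to the pan grid;
        pan: (H, W) high-resolution panchromatic image. Both float, same scale."""
        intensity = rgb_lowres.sum(axis=2, keepdims=True)
        intensity[intensity == 0] = 1e-6          # avoid division by zero
        # Scale each band so the pixel intensity follows the sharp pan image.
        return rgb_lowres * (pan[..., None] / intensity)

    # Toy usage with random data standing in for real imagery.
    rng = np.random.default_rng(0)
    rgb = rng.uniform(0.1, 1.0, size=(120, 100, 3))
    pan = rng.uniform(0.1, 3.0, size=(120, 100))
    sharp = brovey_pansharpen(rgb, pan)
    print(sharp.shape)   # (120, 100, 3)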

All concepts use the custom designed Cypress sensor as the high resolution panchromatic sensor. Concepts 1 and 2 use respectively a frame color sensor or a set of line sensors with a larger pixel size, resulting in a 2 to 4 times lower resolution. Concept 3 has been selected: it uses the same custom designed sensor with a Bayer color filter applied, which simplifies the front-end electronics of the camera considerably. The color sensor resolution is then the same as the panchromatic sensor resolution, although the effective resolution of the color image will decrease due to the longer integration time (2.6 ms) needed to achieve the SNR requirement. Simulations in our image simulator have revealed that a color image obtained with concept 3 has a better quality than the color images obtained with concepts 1 and 2, and is thus more than adequate for pan-sharpening.

3.6 Frame rate and readout rate calculations

The minimum required overlap between subsequent images to perform block bundle adjustment in post-processing is 60%. Taking into account 70% overlap, a nominal UAV ground speed of 25 m/s and two standard deviations for the motion blur statistics, the minimum required frame rate is 0.7 frames/second for both the panchromatic and the color sensor. This corresponds to a total data rate of 170 Mbits/second. Since the S-band data transmitter provides a maximum data rate of 20 Mbits/second, JPEG2000 compression is foreseen in the C&DHU.

In contrast to the modest frame rate, the readout rate of the sensor is much higher. Indeed, the parasitic light sensitivity of the sensor combined with a short integration time constrains the duration of the readout: a readout time of 33 ms results in an intensity offset of around 6%. This corresponds to a readout rate of 30 frames per second, or 360 Mpixels/second.
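The budget in this section can be roughly reconstructed as below. The along-track footprint (1200 pixels at 30 cm GSD), the 70% overlap and the 25 m/s ground speed are from the paper; the 10 bit per pixel depth is our assumption, chosen because it reproduces the quoted 170 Mbit/s. The printed nominal frame rate is the ground-speed-only floor; the paper's 0.7 frames/second additionally budgets two standard deviations of attitude-induced image motion.

    gsd = 0.30               # [m]
    along_track_px = 1200    # frame height in pixels
    overlap = 0.70           # overlap between successive frames
    ground_speed = 25.0      # nominal UAV ground speed [m/s]

    footprint = along_track_px * gsd         # 360 m along track
    advance = footprint * (1.0 - overlap)    # fresh ground per frame: 108 m
    print(f"nominal frame rate: {ground_speed / advance:.2f} frames/s")  # ~0.23

    # Two 10000 x 1200 sensors (pan + color) at an assumed 10 bit/pixel,
    # read at the specified 0.7 frames/s.
    bits_per_frame = 2 * 10_000 * 1200 * 10
    data_rate = bits_per_frame * 0.7
    print(f"raw data rate: {data_rate / 1e6:.0f} Mbit/s")                # ~168
    print(f"compression for a 20 Mbit/s link: {data_rate / 20e6:.1f}x")  # ~8.4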
4. CONCLUSION AND FUTURE WORK

This paper has described the system performance analysis of an ultra-lightweight high-resolution multi-spectral camera, taking into account the strict environmental and physical constraints imposed by a HALE UAV platform flying at 18 km altitude. Subsystem requirements have been derived and used as input for the preliminary design of the system. The camera is designed to operate at a ground resolution of 30 cm in the visible spectrum (400-650 nm), with a swath of 3000 m and a system MTF of at least 10% at Nyquist. To obtain an SNR of 100 at 8:00 am at equinox, a combination of a high resolution panchromatic sensor and a lower resolution color sensor is considered, using pan-sharpening in post-processing. The attitude variations of the platform impose restrictions on the integration time of the sensor. The maximum expected data rate is 170 Mbits/second; on-board JPEG2000 compression is considered to fit the 20 Mbit/second data rate of the data transmitter. The restrictions on weight and the specific stratospheric environment have been taken into account during the preliminary hardware design of the payload and its subsystems. The results of this performance analysis study and the preliminary design of the hardware have shown that the top-level requirements of the MEDUSA camera can be met within the given constraints.

The following tasks for the system performance analysis are part of future work:
- Introduce the influence of front-end electronics, compression and data transmission in the MTF modelling.
- Introduce the noise impact and its correction in the image simulator.
- Refine performance estimations and verification based on the detailed design, assembly, integration and test of the subsystems and camera system.

ACKNOWLEDGEMENTS

This study is part of the ESA/PRODEX MEDUSA Phase B study (PEA C90243).

REFERENCES

ASPRS, 2004. Manual of Photogrammetry, 5th edition. ASPRS, USA. ISBN 1-57083-071-1.

Eoportal, 2006. http://directory.eoportal.org/pres_ikonos2block1.html (accessed 5 Oct. 2006).

Fried, D.L., 1966. Optical resolution through a randomly inhomogeneous medium for very long and very short exposures. J. Opt. Soc. Am. 56, pp. 1372-1384.

Jacobson, D., 2006. http://www.photo.net/photo/optics/lenstutorial (accessed 5 Oct. 2006).

Koren, N., 2001. http://www.normankoren.com/tutorials/mtf.html (accessed 30 June 2006).

Leberl, F., Gruber, M., 2005. Ultracam-D: Understanding some noteworthy capabilities. In: Photogrammetric Week 05, Dieter Fritsch (ed.), Stuttgart, Sept. 2005.

Scott, P.W., 2004. QuickBird on-orbit spatial image quality assessment. DigitalGlobe, Inc., presentation at ASPRS 2004, Denver, Colorado, USA.

Smith, W.J., 2000. Modern Optical Engineering, 3rd edition. SPIE Press / McGraw-Hill, USA. ISBN 0-07-136360-2.

Vexcel, 2006. http://www.vexcel.com (accessed 5 Oct. 2006).

Zhang, Y., 2004. Understanding image fusion. Photogrammetric Engineering and Remote Sensing, June 2004, pp. 658-661.