Quantitative Estimation of Variability in the Underwater Radiance Distribution (RadCam)


Marlon R. Lewis
Satlantic, Inc., Richmond Terminal, Pier 9, 3481 North Marginal Road, Halifax, Nova Scotia, Canada B3K 5X8
phone: (902) 492-4780  fax: (902) 492-4781  email: marlon@satlantic.com

Scott D. McLean
Satlantic, Inc., Richmond Terminal, Pier 9, 3481 North Marginal Road, Halifax, Nova Scotia, Canada B3K 5X8
phone: (902) 492-4780  fax: (902) 492-4781  email: marlon@satlantic.com

Award Number: N00014-07-C0139
http://www.satlantic.com

LONG-TERM GOALS

A significant source of uncertainty in the prediction of the apparent optical properties of the ocean is the geometrical distribution of the radiance field and its variation in time and space; this uncertainty directly affects attempts to use measurements of reflectance and attenuation for the diagnosis of ocean constituents. Uncertainties in the time- and depth-dependent variations in the radiance distribution, and their sources of variation, propagate as well into predictions of the performance of new imaging systems such as the virtual periscope. The problem starts at the sea surface, where the generally unknown sky radiance distribution, coupled with a roughened air-sea interface, plays a major role in the transmission of sun and sky radiance below the surface. In the ocean interior, the volume scattering function and the absorption coefficient alter the radiance distribution in both the forward and backward directions; in the common situation of multiple scattering, the uncertainty in the radiance distribution becomes large. In optically shallow areas, non-Lambertian bottom reflectances add to the uncertainty. Our long-term goal is to develop a relatively simple means for the measurement of the full radiance distribution that could be routinely deployed by the optical oceanographic community.
A side benefit would be that many of the measurements currently made, such as planar and scalar irradiance and the angle-dependent Q factor, could be made by various integration operations on the measured radiance field rather than with mechanical diffusers. The potential interferences of various deployment platforms (e.g. shading and reflectances from ships, buoys and towers) could be measured directly rather than inferred from inaccurate assumptions about the underwater radiance distribution. A direct confirmation of the asymptotic radiance distribution can be made. Finally, high-quality, radiometrically calibrated measurements of the radiance distribution, and their time and depth derivatives, can in principle (but not yet in practice) be used to estimate all the inherent optical properties (both the absorption and volume scattering coefficients) and the nature of the air-sea interface.
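The integration operations mentioned above can be sketched numerically. The following is a minimal NumPy illustration, not project code: the angular grid and the isotropic test field are assumptions, and it shows how downwelling planar irradiance (E_d) and downwelling scalar irradiance (E_od) follow from a sampled radiance field by quadrature over the upper hemisphere.

```python
import numpy as np

# Minimal sketch (not Satlantic code): derive downwelling planar and scalar
# irradiance from a radiance field L(theta, phi) sampled on an angular grid.
# The grid resolution and the isotropic test field are assumptions.

def trapz(y, x, axis=-1):
    """Composite trapezoidal rule along one axis (small version-proof helper)."""
    dx = np.diff(x)
    ys = np.moveaxis(y, axis, -1)
    return np.sum(0.5 * (ys[..., 1:] + ys[..., :-1]) * dx, axis=-1)

theta = np.linspace(0.0, np.pi / 2, 91)   # polar angle from zenith, upper hemisphere
phi = np.linspace(0.0, 2.0 * np.pi, 181)  # azimuth
T, _ = np.meshgrid(theta, phi, indexing="ij")

L = np.ones_like(T)                       # isotropic test radiance

w = np.sin(T)                             # solid-angle weight sin(theta)
Ed = trapz(trapz(L * np.cos(T) * w, phi, axis=1), theta)   # planar irradiance
Eod = trapz(trapz(L * w, phi, axis=1), theta)              # scalar irradiance

print(Ed / Eod)   # downwelling average cosine; 0.5 for an isotropic field
```

For an isotropic field the ratio E_d/E_od recovers the downwelling average cosine of 0.5, a convenient sanity check for any such quadrature.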

OBJECTIVES

The Radiance Camera (RadCam) project is part of the Radiance in a Dynamic Ocean (RaDyO) program. The primary objective is to create a camera that can record the spatial radiance distribution at the ocean surface and at depth. The proposed instrument will be uniquely capable of resolving both the downwelling and upwelling radiance distribution and its variation with depth, time and wavelength (L(z, t, θ, φ, λ)); from these measurements, the apparent optical properties E_d, E_u, E_o, E_ou and E_od are computed by integration. The distribution functions (e.g. the average cosines) are computed directly, as are the various diffuse attenuation coefficients and reflectances. The fully specified radiance field therefore provides all the pertinent information to derive not only the apparent optical properties but also the inherent optical properties: the absorption coefficient and, in principle by inversion, the volume scattering function. An instrument capable of this measurement with the necessary accuracy, resolution, and noise characteristics could, again in principle, replace all or most of the optical instruments currently deployed.

APPROACH

While radiance cameras have been built before, they have not been able to image the sun at the surface because of the very high scene dynamic range. RadCam will take advantage of recent developments in high-dynamic-range (HDR) CMOS imaging arrays, originally developed for science, surveillance, and automotive applications. Traditional CCD arrays are linear, which limits the dynamic range that can be achieved; HDR CMOS arrays use a number of different methods to produce a nonlinear response function, giving scene dynamic ranges of up to 120 dB, or 6 decades.

WORK COMPLETED

In the first year of this project we considered several possible cameras and imaging arrays. We tested two candidate cameras/arrays and selected one for RadCam.
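The dynamic-range figures quoted here can be related by a small worked conversion (not from the report): image-sensor dynamic range is conventionally expressed as 20·log10 of the max/min signal ratio, i.e. 20 dB per decade.

```python
# Image-sensor dynamic range convention: 20*log10(max/min signal),
# i.e. 20 dB per decade of radiance, so 120 dB corresponds to 6 decades.

def db_to_decades(db: float) -> float:
    return db / 20.0

def decades_to_db(decades: float) -> float:
    return 20.0 * decades

print(db_to_decades(120.0))   # 6.0 decades of scene dynamic range
print(decades_to_db(10.0))    # a ~10-decade system range is 200 dB
```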
Measurements have shown that it can achieve a scene dynamic range of 6 decades, and an impressive system dynamic range of nearly 10 decades. Three instruments are being designed as part of this project. The first is a reference camera that will be mounted on deck. The second is a logging-type instrument that can be mounted on a Bluefin AUV or an ROV. The third is a profiler that sends data to the surface for real-time processing. The first two cameras are upward looking only (i.e. they record downwelling radiance), while the profiler has both an upwelling and a downwelling camera, allowing it to measure radiance over the entire sphere around the instrument. At this time, only the first two cameras are being built, though the design considers the profiler.

RESULTS

Optics

A fisheye lens with a specified 185-degree field of view was selected for use in the camera. A custom optical system, shown in Figure 1, was designed to reduce the image size produced by the fisheye to fit entirely within the CMOS array. The system also contains a bandpass filter centered at 555 nm with a bandwidth of 20 nm.

Figure 1: Custom optical system used to reduce image size and spectrally filter the image. It includes a fisheye lens, field lens, filter, a pair of achromat doublets and the CMOS array.

The geometric mapping of the optical system is shown in Figure 2. Most fisheye lenses use an equidistant projection, in which the field angle maps linearly to a radial position in the image. The measurements show a field angle of 190 degrees mapped to an image 466 pixels in diameter. This corresponds to an angular resolution of about 0.4 degrees/pixel, better than the 1 degree/pixel specified at the outset of the project. The rolloff of the optical system is shown in Figure 3. The measurements show a drop in throughput to 92% at a field angle of 95 degrees; this drop is almost entirely due to the fisheye lens. Thus the custom optical system exhibits excellent performance.

Figure 2: Graph of the geometric projection of the optical system (image radial position in pixels versus field angle in degrees), showing that the relationship is very linear (fit: y = 2.4948x − 0.6983, R² = 0.9999).
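The linear fit reported in Figure 2 can be inverted to assign a field angle to each radial pixel position. A small sketch follows; the fit coefficients are taken from the figure, and the helper name is ours, not Satlantic's.

```python
import numpy as np

# Sketch based on the Figure 2 fit (r = 2.4948 * angle - 0.6983, R^2 = 0.9999):
# inverting the near-equidistant projection recovers the field angle of a
# pixel at radial distance r from the image center.

SLOPE = 2.4948    # pixels per degree, from the reported linear fit
OFFSET = -0.6983  # pixels

def radius_to_angle(r_pixels):
    """Field angle in degrees for a radial distance (pixels) from image center."""
    return (np.asarray(r_pixels, dtype=float) - OFFSET) / SLOPE

print(1.0 / SLOPE)              # ~0.40 degrees/pixel angular resolution
print(radius_to_angle(233.0))   # field angle at the image edge (r = 233 px)
```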

Figure 3: Graph showing the rolloff of the optical system (normalized radiance versus field angle). The rolloff is negligible out to about 60 degrees field angle and drops to about 92% at 95 degrees.

Figure 4 and Figure 5 show the assembled optical system, including the camera electronics.

Figure 4: Side view of the camera assembly showing the fisheye lens, custom optical system, a translator for accurately positioning the image on the array, and camera electronics. The whole assembly is 16 cm long.
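A flat-field correction for this rolloff amounts to dividing each pixel by the optical throughput at its field angle. A hedged sketch: the tabulated curve below is an assumption read approximately off Figure 3, not measured data, and the real correction would use the as-built system's calibration.

```python
import numpy as np

# Hypothetical rolloff (flat-field) correction. The throughput table is an
# assumed shape matching the Figure 3 description: ~1.0 out to ~60 degrees,
# falling to ~0.92 at the 95-degree field edge.

angle_deg = np.array([0.0, 60.0, 80.0, 95.0])
throughput = np.array([1.00, 1.00, 0.97, 0.92])   # assumed values

def correct_rolloff(radiance, field_angle_deg):
    """Divide measured radiance by the interpolated optical throughput."""
    t = np.interp(field_angle_deg, angle_deg, throughput)
    return radiance / t

print(correct_rolloff(0.92, 95.0))   # recovers 1.0 at the field edge
```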

Figure 5: Back view of the custom camera circuit board. The metallic box is the fiber transceiver.

Electronics

Custom electronics were required to serialize the video stream from the camera (Profiler and ROV/AUV) and transmit it over a fiber optic cable to a computer on deck for real-time processing. The serialized data rate is approximately 500 Mb/s. At the surface the data is deserialized and sent to a framegrabber, which assembles the images and transfers them to memory for processing. The boards have been designed, built, and tested, with the exception of the connection to the framegrabber. The custom camera board was designed so that it can be used in any of the instruments.

The instruments were designed to handle data from ancillary sensors, including tilt/heading and pressure. The profiler will include a full suite of instruments, including integrating radiometers, attenuation and backscatter meters, and others. The data from these instruments is interleaved with the video data stream. The ROV/AUV camera has no connection to a deck unit and logs data onboard: a PC104-style computer captures and stores the video and ancillary sensor data. On retrieval of the instrument, the data is offloaded over a high-speed Ethernet connection. The only real-time processing performed onboard is determining which exposure to use.

Software

Much of the instrument control software has already been written for the cameras. It is written in Java with interfaces to native C code for specific third-party hardware. Java simplifies the development of a user interface, including live video display and camera control. The software will interface with existing Satlantic code to display and log ancillary sensor data. Images will be processed in real time, though the maximum frame rate will depend on the processing power of the computer and still needs to be determined. Processing includes applying calibration coefficients to each pixel, compensating for tilt/heading, and calculating apparent and inherent optical properties (see Figure 6). The software must also determine when a change in exposure is required; this last task is not trivial and has not yet been implemented.

Figure 6: Test image (left) and output (right) of the tilt/heading compensation algorithm. User-supplied tilt and heading information were used instead of real sensor data for testing purposes. The compensated image shows a crescent-shaped area that corresponds to a region outside the fisheye field of view.

Calibration

The calibration of the CMOS array is one of the most challenging parts of the RadCam project. Unlike CCDs, which have a quite uniform response over the array, every pixel in a CMOS device has a slightly different response (grayscale versus radiance), because each pixel in the CMOS array contains its own electronics. Thus every pixel in the array must be individually calibrated. Particular challenges for calibrating RadCam are the very large dynamic range and the very high peak radiance (7×10⁶ µW/cm²/nm/sr).

Two methods of measuring the response curves were used. The first is a standard method that uses a NIST-traceable FEL lamp, a rail, and neutral density filters; images are captured from the array at several distances from the lamp. The limited power and point-source nature of the FEL lamp do not allow it to be used for the highest end of the response curve. An arc lamp was tested, but a beam of sufficient spatial uniformity could not be produced. We now use a HeNe laser as the source and scan the array behind it.
While the beam from a HeNe laser is Gaussian, it was found to be sufficiently uniform over a small spot (a few tens of microns in diameter) that it could be used for calibration. During development of the calibration procedure, a nonlinearity in the array was observed: at high radiances the grayscale response of the array can saturate and then decrease, an effect the manufacturer describes as "eclipsing". On-chip circuitry is used to combat the problem and works well when small numbers of pixels are brightly illuminated; when large numbers of pixels are brightly illuminated, the problem is still observed. Only solar and nearby pixels should reach these radiance levels, so the final images should be accurate, though some verification will be performed.

Software was written to semi-automate data collection during calibration. It takes care of scanning the array behind the HeNe beam, isolating relevant data, and averaging frames together to reduce temporal noise. Software was also written to process the large calibration data sets; this program takes care of fitting together data runs, sectioning the nonlinear response curves, and fitting polynomials and hyperbolic functions to the sections. A final calibration coefficient file is about 30 MB per exposure setting.

Figure 7: Calibration data (grayscale DN versus relative radiance) showing individual runs fitted together to produce an overall response curve. At the left end there are some fitting problems, believed to be due to stray light during data capture. Data for the high-radiance end of this exposure is not shown.

While the system will operate over nearly 10 decades of dynamic range, only about 6 decades can be captured at any particular exposure. Therefore, a search was performed to determine which exposure settings would be suitable. We have decided on three exposures that cover a dynamic range of more than 9 decades. The roughly measured response curves are shown in Figure 8.
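The grayscale-to-radiance inversion can be illustrated with a single synthetic pixel. This sketch substitutes a monotone table lookup for the sectioned polynomial/hyperbolic fits actually used in the project; the logarithmic response curve below is invented purely for illustration.

```python
import numpy as np

# Illustrative per-pixel inversion of a nonlinear response curve
# (grayscale DN -> relative radiance). The synthetic curve below stands in
# for one pixel's measured response over 6 decades; the real calibration
# fits polynomials and hyperbolic functions to sections of measured data.

radiance = np.logspace(-6, 0, 601)                  # relative radiance grid
dn = 1023.0 * (np.log10(radiance) + 6.0) / 6.0      # synthetic grayscale curve

def dn_to_radiance(gray):
    """Invert the (monotone) response curve by table interpolation."""
    return np.interp(gray, dn, radiance)

print(dn_to_radiance(511.5))    # mid-scale DN maps to ~1e-3 on this curve
print(dn_to_radiance(1023.0))   # full scale maps to relative radiance 1.0
```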

Figure 8: Graph showing roughly measured response curves (grayscale versus log relative radiance) for the three exposures that will be used in RadCam. The first exposure, suitable for use at the surface, covers about 6 decades of dynamic range; the second extends from about 3 decades to nearly 9 decades; the third extends from about 6.5 decades to about 9.5 decades.

Mechanical

Figure 9 shows the basic layout of two of the RadCam instruments: the ROV/AUV camera and the Reference camera. The ROV/AUV camera is wider to accommodate a PC104 computer, while conforming to the space limitations of a Bluefin AUV instrument bay. Some additional components that will be in the instrument, including a pressure sensor, are not shown. The Reference camera is shown with a dome but will be operated without the dome to eliminate reflections and glare.

Figure 9: Mechanical layout of the ROV/AUV camera (left) and the Reference camera (right).

IMPACT/APPLICATIONS

The camera may have applications for various sorts of surveillance. The derivation of optical properties from measurement of the full radiance distribution may have practical applications.

RELATED PROJECTS

This project is embedded within the Radiance in a Dynamic Ocean (RaDyO) program, and hence is related to all projects contained therein.

HONORS/AWARDS/PRIZES

Lewis, M.R.: Awarded Killam Professor of Oceanography, Dalhousie University, Killam Foundation.