Wavefront sensing for adaptive optics

Wavefront sensing for adaptive optics. Brian Bauman, LLNL. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

Acknowledgments. Wilson Mizner: "If you steal from one author it's plagiarism; if you steal from many it's research." Thanks to: Richard Lane, Lisa Poyneer, Gary Chanan, Jerry Nelson; now add Marcos van Dam.

Outline: wavefront sensing; Shack-Hartmann (the Hartmann test, history of the Shack-Hartmann WFS, centroid estimation, SH WFS design); pyramid; curvature. Not covered: direct phase/interferometric measurements, phase retrieval. Topics are covered with a bit of the optical engineer's point of view.

Hartmann test. Before there was Shack, there was Hartmann (1900; 1904, in German). Used for testing the figure of optics: a screen with holes samples the wavefront W(x,y), and spots are recorded on a detector (film/CCD) a distance z away. Comparing each spot to its reference position gives a displacement Δy, and hence a local slope = Δy/z.

Hartmann masks. Originally a polar array of holes to sample the aperture; this suffered from sparse sampling at the outer edge (or over-dense sampling near the center), and radial patterns were hard to see. Holes are sized according to power and diffraction spot size. A helical pattern was used for testing the Lick 3-meter mirror (Mayall & Vasilevskis, 1960). The square grid was introduced in the early 1970s (Malacara).

Hartmann test with lenslets. Same geometry, but the screen of holes is replaced by lenslets: each lenslet forms a spot on the detector (film/CCD), and the spot displacement Δy over the distance z gives the local slope of W(x,y), slope = Δy/z.

History of Shack's/Platt's modifications. The original application was measuring atmospheric distortions in order to deconvolve images of satellites. Holes were replaced with lenslets to maximize throughput (the application was measuring atmosphere-distorted wavefronts) and to reduce spot size. Lenslets were made by polishing glass with a 150-mm-long cylindrical nylon mandrel sliding on a steel shaft until the cylindrical divot reached the desired width (1 mm), then shifting the mandrel by the lenslet pitch. The cylinders were polished to λ/20 and used as masters in a molding process: plexiglass was molded between crossed cylindrical sets to form spherical lenslets on a square grid. The molds were formed in Platt's kitchen oven, softening the plexiglass until it slumped between the masters; the plexiglass was trimmed with an electric kitchen knife. Platt and Shack, J. Refractive Surgery, vol. 17, pp. S573-S577 (2001).

Shack-Hartmann spots

Shack-Hartmann spots: 45-degree astigmatism.

Dot relay. Lenslets are generally available only in fixed sizes, and CCD pixels come in fixed sizes, but the lenslet pitch can be adapted to the CCD pixel pitch via a relay, often a two-lens 4-f telescope for low aberrations/geometric distortion. A relay is often necessary anyway because of short lenslet focal lengths and clearance issues. Model it as a separate imaging system: the dots are the objects and the entrance pupil is at infinity (telecentric). There is rarely any optical design advantage in modeling the lenslet array as such; divide the design into before-lenslets and after-lenslets. The dot plane is relayed to the CCD plane, and the relay's internal pupil is a good place for filters.

Dot relay design considerations. Once the wavefront is sampled by the lenslets, the game is over; the wavefront measurement has been made. The relay need only avoid blurring the spots too much and avoid introducing unacceptable distortion, which would be interpreted as a wavefront error by the WFS. The f-number is generally slow (e.g., ~f/20), so re-imaging the dots is not difficult. For quad-cell systems, the spacing between the relay lenses needs to be perturbed from 4f; otherwise no adjustment of the dot magnification is possible. There is a pupil with respect to imaging the dots; this is a good place for filters, since it is after the measurement of the wavefront and affects all subapertures equally.

Spot size/subaperture size. Spot size ~ λ/d, where d is the subaperture size. Typically d is on the order of the actuator pitch (often exactly the actuator pitch: Fried geometry) and is on the order of r_0 at the science wavelength. For λ = 0.8 μm and d = 40 cm, the spot size is approximately 0.8 μm / 0.4 m = 2 μrad = 0.4 arcsec. Spot-size trade-off: bigger subapertures give more light and better SNR in the centroid measurement, but a poorer fit to the wavefront. If subapertures are too small, the spot size increases due to diffraction, which degrades the spot centroid estimate (the centroid error is proportional to spot size). In the example above, a 5% spot-size displacement error corresponds to 0.1 μrad, and 0.1 μrad * 0.4 m = 40 nm of tilt across the subaperture.
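
A quick numerical check of the arithmetic above (a minimal sketch; the wavelength, subaperture size, and 5% centroid error are the slide's example values):

    import numpy as np

    wavelength = 0.8e-6          # m (slide example)
    d_subap = 0.4                # m, subaperture size (slide example)

    spot_size_rad = wavelength / d_subap              # ~ lambda/d diffraction spot size
    spot_size_arcsec = np.degrees(spot_size_rad) * 3600

    centroid_frac_error = 0.05                        # 5% of the spot size, as in the slide
    tilt_error_rad = centroid_frac_error * spot_size_rad
    wavefront_tilt_m = tilt_error_rad * d_subap       # tilt across the subaperture, meters

    print(f"spot size: {spot_size_rad*1e6:.1f} urad = {spot_size_arcsec:.2f} arcsec")
    print(f"5% centroid error -> {wavefront_tilt_m*1e9:.0f} nm of tilt across the subaperture")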

Plate scale. Plate scale refers to the size in arcsec of a pixel on the CCD of the SH WFS, often ~1-2 arcsec/pixel. Plate-scale trade-off: bigger pixels (in arcsec) give more range without crossover between subapertures, but let in more sky background. One could use more pixels per subaperture, but that increases noise in the estimate due to read noise/dark current.

Typical vision science WFS: lenslets and CCD, with many pixels per subaperture.

Typical astronomy WFS: the former Keck AO WFS sensor. 200 μm lenslets are re-imaged by a relay lens with a 3.15x reduction onto a CCD with 21 μm pixels, giving 3x3 pixels per subaperture (a 2 mm scale is indicated in the figure).
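
A quick sanity check of how the relay reduction maps the lenslet pitch onto a whole number of pixels (a sketch using the numbers from the slide):

    lenslet_pitch_um = 200.0      # lenslet pitch at the lenslet array
    relay_reduction = 3.15        # relay demagnification
    pixel_um = 21.0               # CCD pixel size
    pixels_per_subap = 3          # pixels across one subaperture

    pitch_at_ccd = lenslet_pitch_um / relay_reduction     # ~63.5 um at the CCD
    print(pitch_at_ccd, pixels_per_subap * pixel_um)      # 63.5 vs 63.0 um: pitch ~ 3 pixels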

Centroiding. Once you have generated spots, how do you determine their positions? The performance of the Shack-Hartmann sensor depends on how well the displacement of the spot is estimated. The displacement is usually estimated using the centroid (center-of-mass) estimator, which is the optimal estimator for the case where the spot is Gaussian distributed and the noise is Poisson: s_x = Σ x I(x,y) / Σ I(x,y), s_y = Σ y I(x,y) / Σ I(x,y).
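
A minimal center-of-mass centroid over a subaperture image might look like this (an illustrative sketch; the pixel-coordinate convention is an assumption, not from the slides):

    import numpy as np

    def centroid(img):
        """Center-of-mass estimate (s_x, s_y) of a spot in a 2-D subaperture image."""
        y, x = np.indices(img.shape)          # pixel coordinate grids
        total = img.sum()
        return (x * img).sum() / total, (y * img).sum() / total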

Centroiding noise. Due to read noise and dark current, all pixels are noisy. Pixels far from the center of the subaperture are multiplied by a large number in s_x = Σ x I(x,y) / Σ I(x,y), where x = {..., -3, -2, -1, 0, 1, 2, 3, ...}. The pixels with the most leverage on the centroid estimate are the dimmest (therefore the pixels with the least information), and there are lots of dim pixels. The more pixels you have, the noisier the centroid estimate!
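
A small Monte Carlo sketch of this effect (the spot width, flux, and read-noise level are assumed values, not from the slide): with read noise only, the centroid error grows as the window gets larger.

    import numpy as np

    rng = np.random.default_rng(0)

    def make_spot(n, sigma=1.5, flux=1000.0):
        """Gaussian spot centered in an n x n window, normalized to the given flux."""
        y, x = np.indices((n, n)) - (n - 1) / 2.0
        img = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        return flux * img / img.sum()

    def centroid(img):
        y, x = np.indices(img.shape)
        return (x * img).sum() / img.sum()

    for n in (4, 8, 16):                        # window sizes (pixels)
        errs = []
        for _ in range(2000):
            noisy = make_spot(n) + rng.normal(0.0, 3.0, (n, n))  # 3 e- rms read noise (assumed)
            errs.append(centroid(noisy) - (n - 1) / 2.0)
        print(n, np.std(errs))                  # centroid error grows with window size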

Weighted centroid The noise can be reduced by windowing the centroid:

Weighted centroid. One can use a square window or a circular window, or better still a tapered window w(x,y): s_x = Σ x w(x,y) I(x,y) / Σ w(x,y) I(x,y), s_y = Σ y w(x,y) I(x,y) / Σ w(x,y) I(x,y).
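
A sketch of the tapered (Gaussian-weighted) variant, in the same style as the centroid sketch above; the window width and the choice of centering the window on the brightest pixel are assumptions:

    import numpy as np

    def weighted_centroid(img, sigma_w=2.0):
        """Centroid with a Gaussian taper w(x,y) centered on the brightest pixel."""
        y, x = np.indices(img.shape)
        yc, xc = np.unravel_index(np.argmax(img), img.shape)   # rough spot location
        w = np.exp(-((x - xc)**2 + (y - yc)**2) / (2 * sigma_w**2))
        wi = w * img
        return (x * wi).sum() / wi.sum(), (y * wi).sum() / wi.sum()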

Correlation (matched filtering). Find the displacement of the image that gives the maximum correlation with a reference spot image w: (s_x, s_y) = argmax over (s_x, s_y) of Σ w(x - s_x, y - s_y) I(x,y).

Correlation (matched filtering). The noise is independent of the number of pixels, so the noise performance is much better when there are many pixels. The estimate is independent of uniform background errors and is relatively insensitive to the assumed reference image.
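
A minimal sketch of correlation centroiding (integer-pixel only; sub-pixel interpolation of the correlation peak, which a real WFS would use, is omitted here):

    import numpy as np
    from scipy.signal import correlate

    def correlation_shift(img, ref):
        """Integer-pixel shift of img relative to the reference spot ref (matched filter)."""
        c = correlate(img, ref, mode='same')
        peak_y, peak_x = np.unravel_index(np.argmax(c), c.shape)
        return peak_x - img.shape[1] // 2, peak_y - img.shape[0] // 2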

Quad cells. In astronomy, wavefront slope measurements are often made using a quad cell (2x2 pixels). Quad cells are faster to read, faster for computing the centroid, and less sensitive to noise. With the quadrant intensities I_1, ..., I_4 numbered counter-clockwise from the top-right, the slope estimates are s_x = [(I_1 + I_4) - (I_2 + I_3)] / (I_1 + I_2 + I_3 + I_4) and s_y = [(I_1 + I_2) - (I_3 + I_4)] / (I_1 + I_2 + I_3 + I_4).

Quad cells. The estimated centroid position is linear with displacement only over a small region (small dynamic range), and the sensitivity is proportional to the spot size. Figure: estimated centroid position vs. displacement for different spot sizes.

Denominator-free centroiding. When the photon flux is very low, noise in the denominator increases the centroid error. The centroid error can be reduced by using the average value of the denominator: s_x = [(I_1 + I_4) - (I_2 + I_3)] / E[I_1 + I_2 + I_3 + I_4], s_y = [(I_1 + I_2) - (I_3 + I_4)] / E[I_1 + I_2 + I_3 + I_4].
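
A sketch of both quad-cell estimators (quadrant labeling as above; the mean flux used for the denominator-free version is an assumed calibration input):

    def quad_cell(I1, I2, I3, I4, mean_flux=None):
        """Quad-cell slope estimates; if mean_flux is given, use the averaged denominator."""
        denom = mean_flux if mean_flux is not None else (I1 + I2 + I3 + I4)
        s_x = ((I1 + I4) - (I2 + I3)) / denom
        s_y = ((I1 + I2) - (I3 + I4)) / denom
        return s_x, s_y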

Laser guide star elongation. Shack-Hartmann subapertures see a line, not a spot.

LGS elongation at Keck. The laser is projected from the right.

A possible solution for LGS elongation: a radial-format CCD. Arrange the pixels to be at the same orientation as the elongated spots. This design is currently being tested for TMT.

Dynamic refocusing for pulsed lasers: a powered mirror on a mechanical resonator (U of A); segmented MEMS, one segment per subaperture (Bauman; Baranec); rotating phase plates (e.g., Alvarez lens) (Bauman).

Problems with the SH WFS. The spot size is large (~λ/d). The crucial measurement is made at the junction between pixel boundaries, which are indistinct (charge diffusion of ~1/3 pixel has been reported). Worst of all worlds: photons near the knife-edge generate all the noise and none of the signal!

Foucault knife-edge test. Foucault (1858; 1859, in French). Knife-edge test for a perfect lens (top) and one with spherical aberration (bottom); at right are the observer's views of the pupil in each case. Also shown: an irregular mirror tested with the knife-edge test.

Foucault test with mirror

Pyramid WFS. The pyramid is a simultaneous implementation of 4 Foucault knife-edge measurements. The SH WFS divides the aperture into subapertures (via lenslets), then the field into quadrants (via pixels); the PWFS does this in the reverse order: the pyramid divides the field into quadrants (via the pyramid facets), then the aperture into subapertures (via pixels). Layout: the incoming beam comes to focus on the pyramid at the image plane; a field lens forms four pupil images on a CCD at the pupil plane, with CCD pixels demarking the subapertures.

PWFS details. Pyramids are naturally quite small: the size of the pyramid is ~ n * (λ * f#), where n is the number of subapertures (it acts as a natural spatial filter). Pyramids have tight fabrication tolerances: the edge precision must be a fraction of the full-aperture diffraction spot size (e.g., λ = 1 μm at f/15 requires sub-micron precision). The beam can be made slower to relax the edge requirements, but at the cost of length. Pyramids can be made of glass using cemented facets, but it is difficult to make sharp edges; a lenslet-based PWFS can be used instead (coming up). Note the advantage: if the edges of the pyramid are sharp, the centroid measurement can be quite precise, and the indistinct CCD pixel boundaries are relegated to the subaperture division, where they are not crucial. Also interesting: as the wavefront slopes become small, the PWFS becomes a direct phase-measuring device.
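
A quick sketch of the size scaling quoted above (the wavelength and f-number are the slide's example; the number of subapertures is an assumed value):

    n_subaps = 40                 # subapertures across the pupil (assumed, not from the slide)
    wavelength = 1.0e-6           # m (slide example)
    f_number = 15.0               # slide example

    spot = wavelength * f_number              # full-aperture diffraction spot ~ lambda*f/#: 15 um
    pyramid_size = n_subaps * spot            # ~ n*(lambda*f/#): ~0.6 mm across
    print(f"spot {spot*1e6:.0f} um, pyramid ~{pyramid_size*1e3:.1f} mm across")
    # The pyramid edge must be sharp to a small fraction of the 15 um spot, i.e. sub-micron.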

SH WFS vs. PWFS. Geometrically they are identical, just a remapping of pixels; the diffractive advantage of the PWFS appears in the high-Strehl regime.

Pyramid wavefront sensor non-linearity. When the aberrations are large (e.g., the defocus shown below), the pyramid sensor is very non-linear (it reaches saturation). Figure: the 4 pupil images and the x- and y-slope estimates.

Modulation of the pyramid sensor. Without modulation, the response is linear over the spot width; with modulation, it is linear over the modulation width.

Another pyramid implementation: pyramid + field lens = 2x2 lenslet array. Lenslets are inexpensive and easily replicated, and the right manufacturing technique produces sharp boundaries between lenslets (where all the action is). (Bauman dissertation)

Brightening of the rim is a real effect. The PWFS is not quite a slope detector but a derivative detector (an effect also seen in knife-edge tests), and there is a large derivative (in amplitude) at the edge of an aperture. The pupil images should not be too close together, to avoid contamination between them. Image of a PWFS: Johnson et al., 2006.

Curvature sensing. Geometry: the wavefront at the aperture, with image 1 recorded at distance +z and image 2 at distance -z.

Curvature sensing. Developed by Roddier for AO in 1988. There is a linear relationship between the curvature in the aperture and the normalized intensity difference between two defocused images. Broadband light helps reduce diffraction effects. Curvature sensing tends to be used in lower-order systems (i.e., fewer subapertures/actuators) because of its higher error propagation. The effective propagation distance is z = f (f - l) / l, where f is the focal length and l is the distance of the defocused image from the focal plane. Figure: aperture with defocused images 1 and 2.

Curvature sensing. Using the irradiance transport equation, ∂I/∂z = -(∇I · ∇W + I ∇²W), where I is the intensity, W is the wavefront, and z is the direction of propagation, we obtain a linear, first-order approximation, which is a Poisson equation with Neumann boundary conditions.

Solution at the boundary. If the intensity is constant at the aperture, the normalized difference (I_1(x) - I_2(x)) / (I_1(x) + I_2(x)) reduces to a combination of step functions of the form H(x ± R ± z W_x), so that at the edge of the aperture (radius R) the signal measures the wavefront slope there.

Solution inside the boundary. (I_1 - I_2) / (I_1 + I_2) = z (W_xx + W_yy), where W_xx + W_yy is the curvature. There is a linear relationship between the signal and the curvature, and the sensor is more sensitive for large effective propagation distances.

Curvature sensing. As the propagation distance z increases, the sensitivity increases, the spatial resolution decreases, diffraction effects increase, and the relationship between the signal (I_1 - I_2)/(I_1 + I_2) and the curvature W_xx + W_yy becomes non-linear.
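
A minimal numerical sketch of the linear regime of this relationship (the wavefront, grid, and effective propagation distance are assumed toy values, not from the slides): compute the Laplacian of W and form the predicted signal z (W_xx + W_yy).

    import numpy as np

    # Toy wavefront: pure defocus W = a*(x^2 + y^2) on a 1 m pupil.
    n, a, z = 128, 50e-9, 1.0e4            # grid points, defocus coefficient (1/m), eff. distance (m)
    x = np.linspace(-0.5, 0.5, n)          # meters
    X, Y = np.meshgrid(x, x)
    W = a * (X**2 + Y**2)                  # wavefront, meters

    dx = x[1] - x[0]
    Wy, Wx = np.gradient(W, dx)            # first derivatives (axis 0 = y, axis 1 = x)
    Wyy = np.gradient(Wy, dx, axis=0)
    Wxx = np.gradient(Wx, dx, axis=1)

    signal = z * (Wxx + Wyy)               # predicted (I1 - I2)/(I1 + I2) in the linear regime
    print(signal[n // 2, n // 2], 4 * a * z)   # central value equals z * 4a for pure defocus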

Curvature sensing. A practical implementation uses a variable-curvature mirror (to obtain images below and above the aperture) and a single detector.

Curvature sensor subapertures. Measure the intensity in each subaperture with an avalanche photodiode (APD): it detects individual photons, so there is no read noise.