Wavefront sensing for adaptive optics

Wavefront sensing for adaptive optics Richard Dekany Caltech Optical Observatories 2009

Thanks to / Acknowledgments: Marcos van Dam (original screenplay), Brian Bauman (adapted screenplay). Contributors: Richard Lane, Lisa Poyneer, Gary Chanan, Jerry Nelson, and others. Elements of this presentation were prepared under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

Outline. Wavefront sensing: Shack-Hartmann (the Hartmann test, history of the Shack-Hartmann WFS, centroid estimation, SH WFS design), pyramid, and curvature sensing. Not covered here: shearing interferometers, direct phase/interferometric measurements, phase retrieval, double curvature sensing, PIGS, SPLASH, L-O PWFS, etc. Topics are covered with a bit of the Optical Engineer's point of view (the Boss : Springsteen :: the Optical Engineer : Bauman).

Hartmann test. Before there was Shack, there was Hartmann (1900; 1904, in German). Used for testing the figure of optics. [Figure: wavefront, perforated screen, detector (film/CCD), reference spots]

Hartmann test. The spot displacement gives the local wavefront slope: slope = Δy/z, where Δy is the spot displacement on the detector and z is the screen-to-detector distance. [Figure: wavefront, screen, detector (film/CCD)]

Hartmann masks. Originally a polar array of holes was used to sample the aperture; it suffered from sparse sampling at the outer edge (or over-dense sampling near the center), and radial patterns were hard to see. Holes were sized according to power and diffraction spot size. A helical pattern was used for testing the Lick 3-meter mirror (Mayall & Vasilevskis, 1960). The square grid was introduced in the early 70s (Malacara).

Shack-Hartmann test. Before there was Shack, there was Hartmann (1900; 1904, in German). Same geometry, but with lenslets in place of the screen of holes: slope = Δy/z. [Figure: wavefront, lenslets, detector (film/CCD)]

History of Shack's/Platt's modifications. The original application was measuring atmospheric distortions in order to deconvolve images of satellites. Holes were replaced with lenslets to maximize throughput (the application was measuring atmosphere-distorted wavefronts) and to reduce spot size. Lenslets were made by polishing glass with a 150-mm-long cylindrical nylon mandrel sliding on a steel shaft until the cylindrical divot was the desired width (1 mm), then shifting the mandrel by the lenslet pitch. Cylinders were polished to λ/20. The glass cylinders were used as masters in a molding process: plexiglass was molded between crossed cylindrical sets to form spherical lenslets on a square grid. The molds were formed in Platt's kitchen oven, softening the plexiglass until it slumped between the masters; the plexiglass was trimmed with an electric kitchen knife. Platt and Shack, J. Refractive Surgery, vol. 17, pp. S573-S577 (2001).

Shack-Hartmann spots

Shack-Hartmann spots 45-degree astigmatism

Lenslets-to-CCD: the dot relay. Lenslets are generally available only in fixed sizes, and CCD pixels come in fixed sizes, but the lenslet pitch can be adapted to the CCD pixel pitch via a relay, often a two-lens 4-f telescope for low aberrations and geometric distortion. A relay is often necessary anyway because of short lenslet focal lengths and clearance issues. Model the relay as a separate imaging system: the dots are the objects, and the entrance pupil is at infinity (telecentric). There is rarely any optical design advantage in modeling the lenslet array as such; divide the design into before-lenslets and after-lenslets. [Figure: dot plane (a good place for filters), relay, CCD plane]

Dot relay design considerations. Once the wavefront is sampled by the lenslets, the game is over; the wavefront measurement has been made. The relay need only avoid blurring the spots too much and avoid introducing unacceptable distortion, which would be interpreted as a wavefront error by the WFS. The f-number is generally slow (e.g., f/20 to f/50), so re-imaging the dots is not difficult. For quad-cell systems, the spacing between the relay lenses needs to be perturbed from 4f, otherwise no dot-magnification adjustment is possible. There is a pupil with respect to imaging the dots; this is a good place for filters, as it is after the measurement of the wavefront and affects all subapertures equally. [Figure: dot plane, relay, CCD plane (a good place for filters)]

Spot size / subaperture size. Spot size ~ λ/d, where d is the subaperture size. Typically d is on the order of the actuator pitch (often exactly the actuator pitch in the Fried geometry) and is on the order of r0 to a few r0 at the science wavelength. For λ = 0.8 μm and d = 40 cm, the spot size is approximately 0.8 μm / 0.4 m = 2 μrad = 0.4 arcsec. Spot-size trade-off: bigger subapertures mean more light and better SNR in the centroid measurement, but a poorer fit to the wavefront. If subapertures are too small, the spot size increases due to diffraction and degrades the spot centroid estimate (error proportional to spot size). In the example above, a 5% spot-size displacement corresponds to 0.1 μrad, i.e., 0.1 μrad × 0.4 m = 40 nm of tilt across the subaperture.
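As a quick numerical check of the example above (a minimal sketch; the values are just those quoted in this slide):

```python
import math

lam = 0.8e-6   # sensing wavelength [m]
d = 0.40       # subaperture size [m]

spot = lam / d                                  # diffraction-limited spot size [rad]
print(spot * 1e6, "urad")                       # ~2 urad
print(math.degrees(spot) * 3600, "arcsec")      # ~0.4 arcsec

# A 5% spot-size centroid displacement corresponds to a tilt (OPD) across the subaperture:
disp = 0.05 * spot                              # 0.1 urad
print(disp * d * 1e9, "nm")                     # ~40 nm
```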

Plate scale. Plate scale refers to the size in arcsec (on the sky) of a pixel; for a SH WFS this is often ~1-2 arcsec/pixel. Plate-scale trade-offs: bigger pixels (in arcsec) give more dynamic range without crosstalk between subapertures, but let in more sky-background photons (noise); more pixels per subaperture increases linearity, at the cost of more read noise and dark current in the slope measurement.

Typical vision science WFS. [Figure: lenslets and CCD, with many pixels per subaperture]

Typical astronomy WFS: the former Keck AO WFS sensor. [Figure: 2 mm beam, 200 μm lenslets, relay lens with 3.15x reduction, CCD with 21 μm pixels, 3x3 pixels/subaperture] Low- or zero-noise detectors are starting to change astronomical WFS thinking (more pixels).

Centroiding Once you have generated spots, how do you determine their positions? The performance of the Shack-Hartmann sensor (the quality of the wavefront estimate) depends on how well the displacement of the spot is estimated. The displacement is usually estimated using the centroid (center-of-mass) estimator. This is the optimal estimator for the case where the spot is Gaussian distributed and the noise is Poisson.
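A minimal numpy sketch of the center-of-mass estimator for one subaperture image (the pixel-coordinate convention is illustrative):

```python
import numpy as np

def centroid(img):
    """Center-of-mass (centroid) estimate of the spot position, in pixels,
    measured from the corner of the subaperture image."""
    img = np.asarray(img, dtype=float)
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]      # pixel coordinate grids
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total
```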

Centroiding noise. Due to read noise and dark current, all pixels are noisy. Pixels far from the center of the subaperture are multiplied by a large number in the centroid sum, so the pixels with the most leverage on the centroid estimate are the dimmest ones (the pixels with the least information), and there are many dim pixels. The more pixels you have, the noisier the centroid estimate!

Weighted centroid. The noise can be reduced by windowing the centroid.

Weighted centroid. One can use a square window or a circular window, or, better still, a tapered window.
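A sketch of a tapered (weighted) centroid; the Gaussian taper and its width are assumptions, chosen only to illustrate windowing around a nominal spot position (x0, y0):

```python
import numpy as np

def weighted_centroid(img, x0, y0, sigma=2.0):
    """Centroid computed with a Gaussian taper centered at the nominal spot
    position (x0, y0); pixels far from the spot are strongly down-weighted."""
    img = np.asarray(img, dtype=float)
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    w = np.exp(-((x - x0)**2 + (y - y0)**2) / (2.0 * sigma**2))  # taper window
    wimg = w * img
    total = wimg.sum()
    return (x * wimg).sum() / total, (y * wimg).sum() / total
```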

Correlation (matched filtering). Find the displacement of the image that gives the maximum correlation with a reference spot image.

Correlation (matched filtering). Noise is independent of the number of pixels, so the noise performance is much better when there are many pixels. The estimate is independent of uniform background errors and is relatively insensitive to the assumed reference image. It is computationally more expensive, though this used to matter more.
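A sketch of the correlation estimator: cross-correlate the subaperture image with a reference spot and locate the peak (here only to whole pixels; real systems interpolate the peak). Using circular correlation via FFTs assumes the spot stays well inside the subaperture window.

```python
import numpy as np

def correlation_shift(img, ref):
    """Displacement (dx, dy), in pixels, of img relative to the reference spot ref,
    found as the peak of their circular cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape
    dx = ix if ix <= nx // 2 else ix - nx    # wrap to signed shifts
    dy = iy if iy <= ny // 2 else iy - ny
    return dx, dy
```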

Quad cells. In astronomy, wavefront slope measurements are often made using a quad cell (2x2 pixels). Quad cells are faster to read, faster to compute the centroid from, and less sensitive to noise.

Quad cells. The estimated centroid position is linear with displacement only over a small region (small dynamic range), and the sensitivity is proportional to spot size. [Figure: estimated centroid position vs. displacement for different spot sizes]
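A sketch of the quad-cell slope estimate; the gain that converts the dimensionless signal into a displacement is a calibration quantity that scales with the spot size, which is the origin of the spot-size dependence shown in the figure:

```python
def quad_cell(A, B, C, D, gain=1.0):
    """Quad-cell centroid estimate from the four pixel intensities:
    A = top-left, B = top-right, C = bottom-left, D = bottom-right.
    `gain` (placeholder value) converts the signal to displacement and is
    proportional to the spot size."""
    total = A + B + C + D
    x = gain * ((B + D) - (A + C)) / total   # left/right imbalance
    y = gain * ((A + B) - (C + D)) / total   # top/bottom imbalance
    return x, y
```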

Denominator-free centroiding. When the photon flux is very low, noise in the denominator increases the centroid error variance. The centroid error can be reduced by using the average value of the denominator instead of the per-frame sum.
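A sketch of the idea: replace the noisy per-frame sum in the denominator with an average flux value (how that average is obtained, e.g. over recent frames, is left open here):

```python
import numpy as np

def denominator_free_centroid(img, mean_flux):
    """Centroid about the subaperture center, dividing by an averaged total
    flux rather than the noisy per-frame sum (useful at very low light levels)."""
    img = np.asarray(img, dtype=float)
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    xc, yc = (nx - 1) / 2.0, (ny - 1) / 2.0       # subaperture center
    cx = ((x - xc) * img).sum() / mean_flux
    cy = ((y - yc) * img).sum() / mean_flux
    return cx, cy
```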

Laser guide star elongation. Shack-Hartmann subapertures see a line, not a spot. The elongation is θ ≈ t·s/h², where t is the Na-layer (or range-gate) thickness, s is the distance of the subaperture from the laser launch location, and h is the altitude of the layer. The elongation depends on the projector offset, not on the viewing direction.
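A quick numerical illustration of θ ≈ t·s/h² at zenith (the layer thickness, altitude, and subaperture-to-launch distance below are round illustrative numbers, not values from this talk):

```python
import math

t = 10e3   # Na-layer (or range-gate) thickness [m]
h = 90e3   # Na-layer altitude [m]
s = 5.0    # subaperture distance from the laser launch location [m]

theta = t * s / h**2                          # elongation [rad]
print(theta * 1e6, "urad")                    # ~6.2 urad
print(math.degrees(theta) * 3600, "arcsec")   # ~1.3 arcsec
```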

LGS elongation at Keck II Laser projected from right

A possible mitigation for LGS elongation: a radial-format CCD, a specially oriented array of CCDs on one chip, with the pixels arranged at the same orientation as the elongated spots. This design is currently in hardware testing for the TMT laser.

Dynamic refocusing for pulsed lasers: a powered mirror on a mechanical resonator (U of A); segmented MEMS, one segment per subaperture (Bauman; Baranec); rotating phase plates (e.g., an Alvarez lens) (Bauman).

Problems with the SH WFS. The spot size is large (~λ/d). The crucial measurement is made at the junction between pixel boundaries, which are indistinct (charge diffusion of ~1/3 pixel has been reported). Worst of all worlds: photons near the knife-edge (the pixel boundary) generate all the noise and none of the signal!

Foucault knife-edge test. Foucault (1858; 1859, in French). [Figure: knife-edge test for a perfect lens (top) and one with spherical aberration (bottom); at right are observer views of the pupil in each case. An irregular mirror tested with the knife-edge test.]

Foucault test with mirror

Pyramid WFS: a simultaneous implementation of four Foucault knife-edge measurements. The SH WFS divides the aperture into subapertures (via lenslets), then the field into quadrants (via pixels); the PWFS does this in the reverse order: the pyramid divides the field into quadrants (via the pyramid) and then the aperture into subapertures (via pixels). [Figure: incoming beam, pyramid at the image plane, field lens, CCD at the pupil plane showing four pupils with CCD pixels demarking subapertures]

Pyramids are naturally quite small: the size of the pyramid ~ n·(λ·f#), where n is the number of subapertures (it acts as a natural spatial filter). They have tight fabrication tolerances: edge precision must be a fraction of the full-aperture diffraction spot size (e.g., for λ = 1 μm at f/15, sub-micron precision is required). The beam can be made slower to relax the edge requirement, but at the cost of length. Pyramids can be made of glass, using cemented facets, but it is difficult to make sharp edges; they can also be based on lenslets (coming up). Advantage: if the edges of the pyramid are sharp, the centroid measurement can be quite precise; the indistinct CCD pixel boundaries are relegated to the subaperture division, where they are not crucial. Pyramids can transmogrify: as the wavefront slopes become small, the PWFS becomes a direct phase-measuring device.
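A quick size estimate from the scaling above (the number of subapertures, wavelength, and f-number are example values):

```python
n_subaps = 40    # subapertures across the pupil (example value)
lam = 1e-6       # wavelength [m]
fnum = 15        # beam f-number at the pyramid tip

pyramid_size = n_subaps * lam * fnum   # ~ n * (lambda * f#)
diff_spot = lam * fnum                 # full-aperture diffraction spot size
print(pyramid_size * 1e3, "mm")        # ~0.6 mm across
print(diff_spot * 1e6, "um")           # ~15 um; the edge must be sharp to a small
                                       # fraction of this, hence sub-micron tolerances
```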

SH WFS vs. PWFS. Geometrically they are identical, just a remapping of pixels. The diffractive advantage of the PWFS appears in the high-Strehl regime.

Pyramid wavefront sensor non-linearity. When the aberrations are large (e.g., the defocus shown below), the pyramid sensor is very non-linear (it saturates). [Figure: the 4 pupil images and the x- and y-slope estimates]
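A sketch of how the slope estimates are typically formed from the four pupil images (pupil registration, modulation, and gain calibration are glossed over; the sign convention is arbitrary):

```python
import numpy as np

def pyramid_slopes(I1, I2, I3, I4):
    """x- and y-slope estimates from the four PWFS pupil images (same-shaped arrays,
    one sample per subaperture): I1 = upper-left, I2 = upper-right,
    I3 = lower-left, I4 = lower-right pupil."""
    total = I1 + I2 + I3 + I4
    sx = ((I2 + I4) - (I1 + I3)) / total   # left/right intensity imbalance
    sy = ((I1 + I2) - (I3 + I4)) / total   # up/down intensity imbalance
    return sx, sy
```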

Modulation of the pyramid sensor. Without modulation: linear over the spot width. With modulation: linear over the modulation width.

Another pyramid implementation: pyramid + lens = 2x2 lenslet array (Bauman, 2003, in English). Lenslets are inexpensive and easily replicated, and the right manufacturing technique produces sharp boundaries between lenslets (where all the action is). [Figure: pyramid with field lens vs. equivalent lenslet implementation; Bauman dissertation]

Brightening of the pupil rim is a real effect. The PWFS is not quite a slope detector but a derivative detector (the effect is also seen in knife-edge tests), and there is a large derivative (in amplitude) at the edge of an aperture. Pupils should not be too close together, to avoid contamination between pupil images. [Figure: image of PWFS pupils; Johnson et al., 2006]

Why is a PWFS/Foucault test a slope sensor? Use Fourier optics! [Figure: pupil with wavefront W(x,y), focal-plane mask, relay lenses, CCD at the re-imaged pupil]

1/x is a poor man's approximation to δ⁽¹⁾(x), the derivative of the delta function.

How to convert a SH WFS to a PWFS. This works only when the number of subapertures is approximately equal to the number of pixels per subaperture; otherwise, other optical changes need to be made. Figure 3-17: Conversion of a SH WFS to a PWFS. Top figure: converging light from the left comes to a focus and is then collimated by a collimating lens. The collimating lens creates a pupil downstream, where the lenslet array is placed. The lenslets produce a series of images, or dots, at the focal plane of the lenslets (the "dot plane"). Subsequent relay optics scale the dots as appropriate for the WFS CCD. For clarity, light from only one dot is shown after the dot plane. Bottom figure: to convert the SH WFS to a PWFS, remove the collimating lens and translate the lenslet array, relay optics, and WFS CCD upstream until the lenslet array is at the focus of the incoming beam. The lenslets now produce "pupilets" at the lenslet focal plane, i.e., where the dots were in the top figure. Thus, the relay optics relay the pupilets to the WFS CCD.

Curvature sensing. [Figure: wavefront at the aperture, with Image 1 recorded a distance z downstream and Image 2 a distance z upstream (at -z)]

Curvature sensing. Developed by Roddier for AO in 1988. There is a linear relationship between the curvature in the aperture and the normalized intensity difference. Broadband light helps reduce diffraction effects. Curvature sensing tends to be used in lower-order systems (i.e., fewer subapertures/actuators) because of its higher error propagation [1]. [1] Still an area of active research.

Curvature sensing. Using the irradiance transport equation, ∂I/∂z = -(∇I · ∇W + I ∇²W), where I is the intensity, W is the wavefront, and z is the direction of propagation, we obtain a linear, first-order approximation, which is a Poisson equation with Neumann boundary conditions.

Solution at the boundary. In a simple geometric model, the edge of each defocused pupil image is shifted by ±z w′, where w′ is the wavefront slope at the aperture edge; writing the two images I1 and I2 with Heaviside step functions H(x ± R ± z w′) (R is the aperture radius) shows that the normalized difference (I1 - I2)/(I1 + I2) is non-zero only in a narrow band at the pupil edge. If the intensity is constant at the aperture, this boundary signal measures the radial wavefront slope ∂W/∂n at the edge, providing the Neumann boundary condition.

Solution inside the boundary: (I1 - I2)/(I1 + I2) = -z (Wxx + Wyy), i.e., the curvature. There is a linear relationship between the signal and the curvature, and the sensor is more sensitive for large effective propagation distances.
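A toy numerical check of this linear relation (the grid, the defocus wavefront, and the propagation distance are made up purely for illustration):

```python
import numpy as np

# Toy wavefront: pure defocus W = a*(x^2 + y^2) on a small grid
n, dx, a, z = 64, 0.01, 1e-7, 1.0            # grid size, spacing [m], defocus coeff, distance
y, x = (np.mgrid[0:n, 0:n] - n / 2) * dx
W = a * (x**2 + y**2)

# Finite-difference Laplacian Wxx + Wyy
lap = (np.roll(W, 1, 0) + np.roll(W, -1, 0) +
       np.roll(W, 1, 1) + np.roll(W, -1, 1) - 4 * W) / dx**2

signal = -z * lap                            # predicted (I1 - I2)/(I1 + I2) inside the boundary
print(signal[n // 2, n // 2], "vs analytic", -4 * a * z)   # both ~ -4e-7
```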

Curvature sensing. As the propagation distance z increases: sensitivity increases; spatial resolution decreases; diffraction effects increase; and the relationship between the signal, (I1 - I2)/(I1 + I2), and the curvature, Wxx + Wyy, becomes non-linear.

Curvature sensing. A practical implementation uses a variable-curvature mirror (to obtain the images below and above the aperture plane) and a single detector.

Curvature sensor subapertures. The intensity in each subaperture is measured with an avalanche photodiode (APD), which detects individual photons, so there is no read noise.