Lab #1: X-ray Photon Counting & the Statistics of Light
Lab report is due Wednesday, October 11, 2017, before 11:59 pm EDT
1. Overview

This handout provides a description of the activities that we will explore in the first lab. Because we use this first lab to introduce many new skills, especially Python programming, this handout is a straightforward guide that your lab group should follow sequentially.

1.1 Schedule

This is a four-week lab with activities and lectures on September 11, 18, 25, and October 2. You should have completed up to Section 4.1 and acquired data for Section 4.2 by September 25th. By October 2nd you should aim to reach Section 4.5. Your lab report is due electronically on October 11, 2017 before 11:59 pm EDT.

1.2 Goals

Explore physical limitations on the detection of light. Investigate how CCD imaging sensors work and how to manipulate image data. Determine how precisely brightness can be specified, and what determines that precision. Consider how Poisson and Normal distributions can be used to explain measurements. Finally, learn how to calibrate the image sensor to produce measurements in physical units.

1.3 Reading assignments (available online on course website)

- Linux: To master the first lab, you will need basic Linux system skills so that you can collect data and manipulate files.
- Python: Work through the Python/IPython tutorials to learn how to use this powerful programming language to compute statistical quantities and make plots. Check out the lab web page and the Getting Started guide. Fundamental Python references cover numerical computing and plotting.
- Document preparation: You can use the TeX/LaTeX document preparation system to create publication-quality reports. Download the examples and make sure that you can generate PDF output. You can then use this example as a template for your first report.
- Skim the statistics handout on the class web page.

1.4 Key steps

You will execute eight key steps in this lab:
1.
Familiarize yourself with the CCD image sensor and its adjustable parameters. Learn to acquire and process imaging data using Python.
2. Identify photon detection events using the CCD image sensor and an Fe55 soft X-ray source, and make plots of the counts per sample versus time.
3. Plot histograms to visualize the statistical properties of different data sets.
4. Compute the mean and standard deviation for your samples and investigate the variability of the count rate. Explore how the mean and standard deviation vary as the mean counts per sample increases.
5. Compare the observed histograms with the theoretical Poisson probability distribution function.
6. For datasets consisting of multiple sequences, compute the mean of the means (MOM) and the standard deviation of the mean (SDOM). Explore how the MOM and the SDOM vary with the length of the sequences.
7. Determine the conversion gain (e-/ADU) of the CCD image sensor using the Fe55 X-ray spectrum.
8. Write your lab report.

2. Familiarizing yourself with the camera

At X-ray, ultraviolet, optical and infrared wavelengths most astronomical instruments employ the photoelectric effect to convert photons into electrons, which can then be counted and recorded. Since the 1930s astronomers have used photomultiplier tubes, which employ the photoelectric effect in vacuum. Modern detectors are based on solid-state devices constructed from semiconductor materials. In this lab, we will investigate the properties of a CCD-style integrated circuit that uses the photoelectric effect in silicon to record visible and X-ray light.

Figure 1: Basic image illustrating the photoelectric effect.

The process of photon detection and counting is not perfect: detectors have flaws that introduce noise and various systematic errors into the measurement process. The purpose of this lab is to construct a soft X-ray photon detector using a commercial optical CCD camera which has had its glass window removed. In the process, you will explore the operation of a CCD sensor and the detection of individual photons, and discover the statistics of light. You will carry out X-ray spectroscopy in the process, which will be used to calibrate the CCD camera.
2.1 The CCD image sensor

We will use an image sensor based on a CCD (charge-coupled device) integrated circuit. A CCD sensor is a two-dimensional array of pixels where charges are shifted down each column one row at a time to a horizontal shift register. Charge is generated in each pixel by the photoelectric effect in proportion to the number of visible-light photons or the energy of an X-ray photon incident at that location. The charge within the pixels in each row is then shifted onto a capacitor and read one pixel at a time. This voltage is then amplified by a preamplifier within the sensor and by another adjustable-gain amplifier within the camera. Finally, this analog signal is digitized by an analog-to-digital converter, and the data is conveyed to a computer.
Figure 2: Analog signal chain for a CCD sensor.

The digital signal is recorded in units known as analog-to-digital units (ADU) after being amplified by both the sensor and the camera electronics as shown in Figure 2. Assuming that the measuring circuits are linear, there is a linear relationship between ADU and the number of photoelectrons collected; this constant of proportionality is known as the gain. The entire CCD array is reset before the start of an exposure by clearing any remaining charge built up on the sensor. This establishes a well-defined charge state at the start of the subsequent exposure. The photo-charge, Q_PE, is determined by measuring the voltage induced on a capacitor of capacitance C:

V = Q_PE / C.    (1)

This voltage is therefore directly proportional to the number of incident visible-light photons or the energy of a single X-ray photon detection.

2.2 Point Grey Chameleon3 Camera

We use a two-dimensional sensor manufactured by Point Grey comprising a silicon CCD array (Figure 3). This device uses the solid-state photoelectric effect, where a UV/optical photon of sufficient energy (hν > 1.13 eV for silicon at room temperature) creates an electron-hole pair in the vicinity of the depletion region of a reverse-biased diode junction. The camera is a monochrome version (see Table 1), which is what we will use in this lab.

Table 1: Point Grey Sensor Properties

Specification           Value
Sensor type             1/3-inch Sony EXview HAD Monochrome CCD ICX445
Pixel count             1288 (H) x 964 (V), 1.3 Megapixels
Pixel size (square)     3.75 µm
Digitization            Up to 12-bit (8 bits are used)
Gain range              0 to 24 dB
Shutter range           ms to 32 seconds
Saturation capacity     ~9000 e-
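To get a feel for the numbers in Equation 1, the short sketch below converts a photo-charge into a voltage. The sense-node capacitance used here is a made-up placeholder for illustration, not a published specification of this sensor:

```python
Q_E = 1.602e-19   # electron charge in coulombs
C_SENSE = 50e-15  # hypothetical sense-node capacitance in farads (placeholder)

# Equation 1: V = Q_PE / C, with the photo-charge Q_PE = N_e * Q_E
volts_per_electron = Q_E / C_SENSE
n_e = 1776                         # e.g. electrons from one 6.5 keV X-ray photon
signal_volts = n_e * volts_per_electron

print(volts_per_electron * 1e6, "uV per electron")
print(signal_volts * 1e3, "mV before amplification")
```

With these assumed numbers a single photoelectron induces only a few microvolts, which is why the preamplifier and adjustable-gain amplifier in the signal chain are needed before digitization.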
Figure 3: The Point Grey Chameleon3 image sensor. The light grey area at the center is the CCD array (4.86 mm x 3.62 mm).

Figure 4: The C-mount 25-mm camera lens. The lens screws into the camera body. The designation "C" on the aperture ring means closed.

2.3 Camera Controls

The first step of this lab is to log into a Linux laptop and start the FlyCap2 software application. Be sure that the USB cable is plugged into the laptop and the yellow indicator light is illuminated on the back of the camera. When you start FlyCap2, you'll be asked to select the camera; only one option should be offered. The control window should open with a live display from the camera (be sure that the lens cap is removed). The FlyCap2 tool bar presents some useful options including the camera control window ("Toggle Camera Control Dialog") and the histogram tool. Open the control window and examine the Camera Settings. This is where you control the exposure time, gain (conversion from output voltage to
ADUs), and frame rate. To adjust the parameters manually, uncheck AUTO beside each setting.

Figure 5: FlyCap2 camera control window. The automatic Brightness and Exposure controls have been disabled and the effect of those parameters has been turned off, the Shutter (exposure time) is set manually to 32 ms, and the gain is set at maximum (24 dB).

The Camera Settings panel (Figure 5) lists a number of options including Brightness, Exposure, Gamma, Shutter, Gain and Frame Rate.

- Brightness modifies the display by adding a constant to the data value.
- Exposure means auto-exposure and controls the Shutter and/or Gain if the AUTO option for either of these is enabled.
- Sharpness either smooths or sharpens the image. This should be turned off for our measurements.
- Gamma affects how images are displayed by introducing a non-linear relation between the measured signal from the ADC and the value presented in the image.
- Shutter determines the exposure time. The exposure time can be adjusted from a millisecond-scale minimum up to 32 s. (Most important parameter.)
- Gain: the gain of the amplifier feeding the ADC can be commanded and is variable from 0 to 24 dB. This adjusts the external amplifier in the signal chain (Figure 2), and therefore the sensitivity and also the dynamic range of the camera. This is similar to the ISO settings in modern digital cameras. (Most important parameter.)
- Frame rate determines the maximum rate at which frames are read out. For example, if the frame rate is set to 30 fps (frames per second; the default setting) and the exposure time is 100 ms, then a 10 fps frame rate will be used.
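The interplay between Shutter and Frame Rate described above can be captured in one line. This is a sketch of the camera's behaviour as described here, not a FlyCap2 API call:

```python
def effective_frame_rate(requested_fps, exposure_ms):
    """The camera cannot deliver frames faster than one per exposure,
    so the realized rate is capped by the exposure time."""
    return min(requested_fps, 1000.0 / exposure_ms)

# The example from the text: 30 fps requested with a 100 ms exposure
print(effective_frame_rate(30.0, 100.0))  # -> 10.0
```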
For quantitative work make sure that all the AUTO buttons are unchecked and all On/Off buttons are also unchecked (i.e. off). Otherwise the software may vary the exposure time, gain, and frame rate in response to the illumination level, causing unexpected results. Set the gain to maximum to start.

3. Getting Started: Taking and Saving your First Images

The first part of the lab activity is about familiarizing yourself with the camera controls and methods for acquiring images. For this activity, you need to install a 25 mm lens (Figure 4) on your camera to take visible-light images of your surrounding scene. You will need to play with the focus knob to get images in focus and adjust the aperture knob to let in more or less light. Make sure all AUTO and ON/OFF checkboxes are turned off. Try to understand how the different parameters (particularly exposure time and gain) affect your measurement. This will be important later when you set up your camera for X-ray photon measurements. The cursor readout (bottom of the live display window) and the histogram tool (Figure 6) can be used to examine the image data values.

Figure 6: (Left) The histogram tool showing a histogram of values centered near the lower range of possible values. Note that the Grey channel is checked and the y-axis range has been adjusted (Max Percent) to 10% so that the peak is clearly visible. (Right) The histogram tool can be used to plot rows or columns.

On the course website, we have provided a captureimages.py script that can take a sequence of frames from the camera and save them in a specified directory. The image files from the Point Grey camera are saved in an 8-bit uncompressed TIFF format. The live view in the FlyCap software will need to be paused when using this Python script to acquire data. You can always save individual files using the FlyCap software, but not a sequence of files.
We have also provided a viewimage.py file as an example of how you can read in and display a TIFF file using matplotlib.

4. Key Lab Activities

This section describes the key areas you will need to investigate in this lab and discuss in your lab report. The goal is to understand the properties of the image sensor.
4.1 Measuring image sensor properties

The CCD image sensor has a limited dynamic range. The maximum attainable precision of the signal is determined by the number of bits in its analog-to-digital converter (ADC), and the maximum signal it can detect is determined by the well depth. Both are directly affected by the camera gain setting. Explore how the precision and the maximum signal are each affected by the camera gain setting. Answer the following questions:

- How many bits is the ADC?
- At what ADC value does the signal saturate?
- What happens to the image information in the saturated regions?

Thermal excitation of electrons in the sensor (dark current) can also be an issue in measurements, as it is a signal that is not related to incident photons. Place the lens cap on the camera and explore how the average value changes with longer integration times. Also observe how the image itself changes. Because we are taking long exposures in the next section, we need to pay attention to the effect of dark current. Because it is an additive effect, dark images with the same exposure time can be subtracted from images exposed to light to remove the effect.

4.2 Detecting X-ray photons

We are going to use the method of direct detection to observe X-ray photons in this component of the lab. When an X-ray photon hits the CCD, it generates not one, but several electron-hole pairs. The number of electron-hole pairs the photon generates is proportional to its energy, which is why we can turn our CCD camera into an X-ray spectrometer. The electrons are captured and measured just like they are when we use optical light, but in this case, we will be able to see individual X-ray photons emitted by the Fe55 source! The Fe55 source you will use has a very low (millicurie-level) activity and is sealed. It is relatively safe to handle. However, only the lab coordinators, who are properly trained, will physically handle the X-ray source. Do not try to handle it yourself.
Do not take the source back with you! I will get into a lot of trouble!

For this section of the lab, you will have to remove the lens as the sensor needs to view the X-ray source directly. Be careful as the sensor is exposed and is very fragile! Do not put your finger into the camera barrel or drop anything on the sensor. If the camera is not in use, please put the supplied camera cover on.

Collect a sequence of images after placing the X-ray source facing the camera sensor. Make sure the camera is in a dark environment. Compare with a sequence of images taken without the source and see if you observe any differences. One way to look for differences is to subtract one image from another. Consider the following questions:

- What do you observe?
- If there are changes between the multiple difference images, can you explain why?
- What is the reason for using difference images?
- Do you observe any negative values, and if so what is the reason?

Note: You will need relatively long exposures (> 1 s) to see a significant number of X-ray photons.
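When forming difference images in Python, remember that the 8-bit TIFF frames load as unsigned integers, so a direct subtraction wraps around at zero instead of going negative. A minimal sketch with simulated frames (the noise levels here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two simulated 8-bit frames with independent noise; frame_b also
# contains one bright single-pixel "X-ray hit".
frame_a = rng.integers(10, 20, size=(8, 8)).astype(np.uint8)
frame_b = rng.integers(10, 20, size=(8, 8)).astype(np.uint8)
frame_b[3, 3] += 200

# Cast to a signed type first: uint8 arithmetic wraps around at zero,
# which would hide the negative values you are asked about above.
diff = frame_b.astype(np.int16) - frame_a.astype(np.int16)
print(diff.max(), diff.min())
```

The bright hit stands out strongly in the difference, while the rest of the pixels scatter around zero with both signs.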
It is important that we set the dynamic range correctly so that the X-ray photon measurements do not saturate the detector. Look at the histogram plot to see if a significant portion of the X-ray photon hits are saturated. If they are, you will need to lower the detector gain and redo the measurement. By the same token, you do not want the X-ray photon hits to have too low values either. Try to adjust the camera gain so that you maximize the dynamic range, i.e. use up as many of the bits of the image value as possible. The camera gain is given in decibel units in the GUI, and you can estimate how much the conversion gain changes using the following equation:

g_1 / g_2 = 10^((gdB_1 - gdB_2)/20)    (2)

where g_1 and g_2 are in linear units while gdB_1 and gdB_2 are in decibel units.

Once you have settled on a proper gain setting, now comes the challenge of identifying X-ray photon hits. When an X-ray photon hits the sensor, it does not respect pixel boundaries, and sometimes it does not deposit all of its energy within a single pixel. It then appears as if multiple adjacent pixels have experienced X-ray photon hits. To properly carry out a statistical analysis of X-ray photons, we need to find single-pixel events, which we are certain are from a single X-ray photon. We are effectively counting individual photons here. In order to do this, you will need to write an algorithm that looks at the values of the neighbouring pixels around an X-ray photon detection to ensure their values are not elevated. You need to decide what you consider to be an elevated pixel value. If you pick a very low value, you will aggressively filter out even good single-pixel events. If you pick too high a value, you could let through multi-pixel events. Another way is to only pick events that are above a certain pixel value threshold. If you look at Figure 6, most events are clustered at high values.
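One possible sketch of such a neighbour-check filter is below. Both thresholds are placeholders that you must tune from your own histograms, and the plain Python loop is written for clarity rather than speed:

```python
import numpy as np

def find_single_pixel_events(img, event_thresh, neighbor_thresh):
    """Return (row, col, value) for pixels above event_thresh whose
    eight neighbours all lie below neighbor_thresh."""
    events = []
    rows, cols = img.shape
    for r in range(1, rows - 1):          # skip the image border
        for c in range(1, cols - 1):
            if img[r, c] < event_thresh:
                continue
            patch = img[r-1:r+2, c-1:c+2].copy()
            patch[1, 1] = 0               # ignore the candidate pixel itself
            if patch.max() < neighbor_thresh:
                events.append((r, c, int(img[r, c])))
    return events

# A tiny synthetic frame: one clean single-pixel hit and one split hit
img = np.zeros((7, 7), dtype=int)
img[2, 2] = 200                  # isolated event
img[4, 4], img[4, 5] = 120, 110  # charge split over two pixels
print(find_single_pixel_events(img, event_thresh=100, neighbor_thresh=50))  # -> [(2, 2, 200)]
```

Note how the split event is rejected twice: each of its two pixels fails the neighbour check because of the other.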
This clustering at high values is associated with a bright X-ray emission line (see the following paragraph for an explanation). If you set a threshold high enough to only select events that have pixel values associated with this emission line, you are also likely to pick only single-pixel events. This is because a multi-pixel event spreads the pixel value over multiple pixels, thereby lowering each individual pixel value below the threshold. Another important piece of information to take advantage of is that the number of electrons produced in a pixel by an X-ray photon hit is proportional to its energy:

E_γ = N_e W_eh    (3)

where E_γ is the energy of the photon in eV, N_e is the number of electrons detected per X-ray photon hit, and W_eh is the electron-hole pair production energy in silicon, 3.66 eV at room temperature (Scholze et al. 1998). This means a 6.5 keV X-ray photon will generate 1776 e- of charge in a silicon detector. The electron-hole pair production energy is a fixed property of materials, though W_eh has a slight dependence on temperature. This is different from the energy (1.13 eV) required for an optical photon to create a single electron-hole pair in silicon because the
high-energy photons also inject additional energy into modes within the crystal lattice and heat it up.

If you are successful in properly filtering the single X-ray photon hits, you should obtain a figure like the one below, which shows the pixel value of each identified single X-ray photon hit. At this point, the pixel value is arbitrary as we do not know the conversion gain. We will come back to that in Section 4.7.

Figure 6: Pixel values of identified single X-ray photon events. The large overdensity of X-ray photon pixel values around 220 is the detection of a bright soft X-ray emission line. Remember that this plot was generated for a specific camera gain setting I used. Different gain settings will yield a different pixel value.

4.3 Statistics - Plotting histograms and computing the mean and standard deviation

Now that we are detecting individual X-ray photons we can attempt to carry out statistical analyses. First, review and understand basic descriptive statistics including the mean, standard deviation, and histograms. We will need to obtain sequences for each data set, i.e., sets of images taken at the same integration time and gain setting. The following plot shows the number of single photon events discovered in a sequence of 50 images:
Figure 7: Number of single photon events found in each of the 50 successive images taken observing the Fe55 source. The exposure time of each image is not given.

IPython has all sorts of handy functions in matplotlib and numpy for statistical analysis, e.g., mean for computing means and hist for plotting histograms. Do not use these! First, we want you to understand the basic algorithms; you will not learn these by using a canned routine. Second, you should always exhibit a healthy skepticism about someone else's program: if you can write the function yourself, you should. Third, we want you to learn Python. By using basic arithmetic operators (+, -, *, /) and primitive functions such as numpy.sum, you can compute these quantities with a few lines of code (and do this faster than it takes to read the appropriate documentation). For example, to compute the mean try the following, where x is your data array:

In [89]: import numpy as np
In [90]: m1 = np.sum(x)/(np.size(x))   # first moment <x> = mean
In [91]: m2 = np.sum(x*x)/(np.size(x)) # second moment <x^2>
In [92]: std = np.sqrt(m2 - m1*m1)     # std deviation (<x^2>-<x>^2)^1/2
In [93]: print 'Mean = ', m1
Mean = 4
In [94]: print 'Standard deviation = ', std
Standard deviation =

Although this looks right, something is seriously wrong! What?

Making a histogram involves sorting the data into unique categories (binning), and counting the number of occurrences of each of those categories. The following example makes a histogram of the quantity x:

In [18]: hmin = 0                     # lowest bin in histogram to plot
In [19]: hmax = 12                    # highest bin in histogram to plot
In [20]: hr = np.arange(hmin,hmax+1)  # list of bin values
In [21]: hist = np.array([np.where(x == i)[0].size for i in hr])
In [23]: plt.plot(hr,hist,drawstyle='steps-mid')

Figure 8: A histogram (right) for a sequence of data (left).
The power of Python is apparent in the two statements that make the histogram. First, we decide the highest and lowest bins we want to sample by setting hmin and hmax and then construct a list of bin values using arange:

In [20]: hr = np.arange(hmin,hmax+1)
In [21]: hr
Out[21]: array([ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12])

The key step in making the histogram is counting the occurrences of each of our bin values, i.e., occurrences of the values in the list [0, 1, 2, ..., 12]. The function where does this as it finds where the array x has certain values, e.g.,

In [53]: np.where(x == 1)
Out[53]: (array([20, 67, 71, 72]),)

shows that elements 20, 67, 71, and 72 of x equal 1. The round brackets (parentheses) denote that where returns a Python tuple, which contains the list of array indices where x == 1. The syntax [0] extracts the list as the first element of the tuple, e.g.,

In [81]: np.where(x == 1)[0]
Out[81]: array([20, 67, 71, 72])

We do not care where the instances of x == 1 occur, only how many occurrences there are. We can count these occurrences by using the size attribute:

In [116]: np.where(x == 1)[0].size
Out[116]: 4

Python often has many ways to achieve the same thing; equally functional would be:

In [129]: np.size(np.where(x==1))
Out[129]: 4

Now we want to repeat this counting for every element of x. To do this we iterate over x using a for loop, e.g.,

In [134]: for i in x: print i

Here i is a dummy variable which represents, in order, each element of x. Notice the colon, which indicates the beginning of the commands included in the loop. We can use a for loop to iterate over each bin, e.g.,

In [14]: hmin = 0
In [15]: hmax = 12
In [16]: hr = np.arange(hmin,hmax+1,1)   # make the list of bins
In [17]: hist = np.zeros(hmax-hmin+1,dtype=np.int)          # store the counts here
In [18]: for i in hr: hist[i] = np.where(x == i)[0].size    # count
In [19]: plt.plot(hr,hist,drawstyle='steps-mid')

Notice that the explicit declaration of the variable hist and the for loop replace the enigmatic list comprehension construction:

In [20]: hist = np.array([np.where(x == i)[0].size for i in hr])

Either version works; there is no single right answer. The list comprehension construction may be significantly faster in some circumstances.

Now check that the histogram plot actually reflects your data. Compare the list of counts in the data file and the plot. Do a reality check on a data set where you can rapidly compute the histogram with pencil and paper [1]. Once you can plot histograms with confidence, repeat the experiment, say, a few times and compare the results of your experiments. Does the histogram change? Do you always get the same mean count rate? Become adept at inspecting the histogram plot and guessing the mean and standard deviation.

Calculate the mean and standard deviation of the sequences you just measured. Try measuring the photon flux within smaller regions of your sensor. You can simply look at a smaller section of the numpy array for each image to accomplish this. Measure the mean and standard deviation. Does the flux scale with pixel area?

Make a plot of the mean versus the variance (the standard deviation squared). Now over-plot a line with unit slope, i.e. y = x. What does this tell you about the relation between mean and variance for counting (Poisson) statistics? Is the scatter about the line y = x uniform or does it vary with the mean count? Do Poisson statistics break down for large means?
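Before making the mean-versus-variance plot with real data, it can be reassuring to verify the expected behaviour on simulated Poisson counts. The sketch below uses numpy's random generator purely as a stand-in for the camera, and computes the moments with the same basic arithmetic as above:

```python
import numpy as np

rng = np.random.default_rng(0)
for mu in (3.0, 30.0, 300.0):
    x = rng.poisson(mu, size=100_000)   # simulated counts per sample
    m1 = np.sum(x) / np.size(x)         # first moment: the mean
    m2 = np.sum(x * x) / np.size(x)     # second moment
    var = m2 - m1 * m1                  # variance
    print(f"mu={mu:6.1f}  mean={m1:8.3f}  variance={var:8.3f}")
```

For Poisson statistics the mean and variance should agree closely at every mu, which is exactly the unit-slope relation the plot probes.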
Try plotting your data on a log-log plot by using:

In [35]: plt.yscale('log')
In [36]: plt.xscale('log')

Note: You can also break up each image into several statistically independent sub-regions and use each one as a separate measurement instead of using the whole image as a single measurement. The only difference will be that the mean photon flux will be correspondingly lower, because it should scale as the area of the sub-region. You can use this method to probe how different count rates affect the Poisson distribution in the next section.

4.5 Poisson distribution

Plot a histogram for one of your sequences with a small count rate, e.g., 2-4 counts per sample and lots of samples. Use the trick mentioned in the note above to accomplish this. Otherwise, you will need to take many images, a few hundred, to carry out this measurement. Calculate the mean count rate and compare the resultant histogram with the theoretical Poisson probability distribution:

[1] This is perhaps the most important programming lesson here. Always test your program on a problem where you know the answer. For example, test your histogram program with x = [0,0,0,1,1,1] and x = [0,1,2,3,4,5,6].
P(x; μ) = μ^x e^(-μ) / x!    (4)

where P(x; μ) is the probability that x events will be measured within an interval given a mean of μ events. How do you compare a histogram, which represents measured counts, with a theoretical distribution? The Poisson distribution gives a probability; you have measured counts. Explain how to choose the correct scaling factor (or normalization) to compare the measured and theoretical distributions. Can you use a theoretical probability to predict counts, or can you express your measured counts as a probability? Does the Poisson distribution provide a good description of the data?

Now arrange things so that the number of events per sample is increased. Aim for a few hundred events per sample. Plot the histogram again. What has happened to the shape of the histogram; is it qualitatively different? Does the Poisson distribution still apply? Calculate the mean and standard deviation and over-plot the corresponding Gaussian probability distribution,

P(x; μ, σ) = (1 / (σ sqrt(2π))) exp(-(x - μ)^2 / (2σ^2)).    (5)

Is a Gaussian curve a good approximation to the Poisson distribution? Under what conditions is the Gaussian probability distribution a good approximation?

4.6 Standard deviation of the mean

The more events you count, the more accurately you can measure the number of counts per sample (i.e., the mean count rate). To illustrate the effect, take ten sets of data consisting of 10 exposures each. Compute the number of events within a 100x100 region in each image. For each of these ten sets calculate the mean. Due to statistical variations the ten means will be different, so also calculate the mean of the means (MOM) and the standard deviation of the means (SDOM). The MOM is the best estimate of the counts per sample and the SDOM is a measure of how precisely we know the average counts per sample. How does the SDOM vary with the number of samples in the individual sequences? Experience suggests that if we have more samples in each of our ten measurements the SDOM will be smaller.
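The ten-sequence procedure can be rehearsed on simulated Poisson counts before you try it on camera data. The mean count per sample below is an arbitrary assumed value, and only basic arithmetic is used:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 100.0                                   # assumed mean counts per sample
for n in (10, 40, 160):                      # samples per sequence
    # ten sequences of n samples each, one mean per sequence
    means = [rng.poisson(mu, size=n).sum() / n for _ in range(10)]
    mom = sum(means) / len(means)            # mean of the means
    sdom = (sum((m - mom) ** 2 for m in means) / len(means)) ** 0.5
    predicted = (mu / n) ** 0.5              # SDOM expected from Poisson statistics
    print(f"n={n:4d}  MOM={mom:7.2f}  SDOM={sdom:5.2f}  sqrt(mu/n)={predicted:5.2f}")
```

With only ten sequences the measured SDOM scatters noticeably around sqrt(mu/n), but the 1/sqrt(n) trend should already be visible.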
To quantify this effect, repeat by taking more 100x100 sub-regions within each image. Try including 2, 4, 8, 16, 32, 64, etc. sub-regions in each image. For each number of sub-regions, consider the ten data sets and calculate the mean of the means (MOM) and the standard deviation of the means (SDOM). Plot the MOM and the SDOM as a function of the number of sub-regions. Describe how the MOM and SDOM vary as the number of sub-regions increases. If you want, you can try using smaller sub-regions to increase the maximum number of sub-regions within an image.

Based on your knowledge of Poisson statistics and error propagation, predict the SDOM given the measured mean count per sample and the sample size. Make plots that compare your prediction with the data. If I want to improve the accuracy of a measurement of the mean by a factor of two, by what factor do I need to increase the number of samples?
How accurate is your best estimate of the count rate, i.e., how accurate is the MOM?

4.7 X-ray spectrum and calibration of the image sensor

The decay of Fe55 produces bright soft X-ray emission lines in the few-keV energy range. Fe55 decays by electron capture to manganese-55 (Mn55). As described in Section 4.2, the CCD image sensor acts as a spectrometer because every X-ray photon event produces a certain number of photoelectrons, which is proportional to the X-ray photon energy. Our camera can resolve up to three X-ray emission lines from our source. To view the spectrum of the source, generate a histogram of the pixel value of each X-ray event. You should see at least one strong peak, which is associated with the Kα line of Mn55. Given that we know the energy of the Kα line, we can calculate the corresponding number of photoelectrons generated by a photon of this energy using Equation 3. Since we know the pixel value in ADU and the number of electrons associated with a given photon event, we can calculate the conversion gain in e-/ADU. This allows the conversion of arbitrary pixel values (ADUs) into real physical units (e-). Compute your conversion gain for the data you used in the previous sections. Be sure to indicate your camera gain setting in the FlyCap software. This method is actually used to calibrate astronomical CCDs in the field. The measurement of the conversion gain is critical to understanding the noise properties of CCDs, their charge transfer efficiency, and their throughput on sky.
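As a worked example of this calibration, the sketch below combines the known Mn Kα energy (about 5.90 keV) with Equation 3. The 220 ADU peak position is taken from the example plot earlier in this handout and is illustrative only; it depends on the camera gain setting, so substitute the peak position you actually measure:

```python
W_EH = 3.66        # eV per electron-hole pair in silicon (Scholze et al. 1998)
E_KALPHA = 5898.0  # Mn K-alpha line energy in eV
peak_adu = 220.0   # illustrative peak pixel value; use your measured peak

n_electrons = E_KALPHA / W_EH  # Equation 3 solved for N_e
gain = n_electrons / peak_adu  # conversion gain in e-/ADU
print(f"N_e = {n_electrons:.0f} e-, conversion gain = {gain:.2f} e-/ADU")
```

With these assumed numbers the Kα line corresponds to roughly 1600 photoelectrons, and the resulting gain converts any pixel value in ADU back into electrons.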
X-ray Spectroscopy Laboratory, Suresh Sivanandam, Dunlap Institute for Astronomy & Astrophysics, University of Toronto
Thank you for choosing the MityCAM-C8000 from Critical Link. The MityCAM-C8000 MityViewer Quick Start Guide will guide you through the software installation process and the steps to acquire your first
More informationAST Lab exercise: CCD
AST2210 - Lab exercise: CCD 1 Introduction In this project we will study the performance of a standard CCD, similar to those used in astronomical observations. In particular, the exercise will take you
More informationControl of Noise and Background in Scientific CMOS Technology
Control of Noise and Background in Scientific CMOS Technology Introduction Scientific CMOS (Complementary metal oxide semiconductor) camera technology has enabled advancement in many areas of microscopy
More informationTechnical Notes. Integrating Sphere Measurement Part II: Calibration. Introduction. Calibration
Technical Notes Integrating Sphere Measurement Part II: Calibration This Technical Note is Part II in a three part series examining the proper maintenance and use of integrating sphere light measurement
More informationTutors Dominik Dannheim, Thibault Frisson (CERN, Geneva, Switzerland)
Danube School on Instrumentation in Elementary Particle & Nuclear Physics University of Novi Sad, Serbia, September 8 th 13 th, 2014 Lab Experiment: Characterization of Silicon Photomultipliers Dominik
More informationInterpixel crosstalk in a 3D-integrated active pixel sensor for x-ray detection
Interpixel crosstalk in a 3D-integrated active pixel sensor for x-ray detection The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation
More informationDigital Imaging Rochester Institute of Technology
Digital Imaging 1999 Rochester Institute of Technology So Far... camera AgX film processing image AgX photographic film captures image formed by the optical elements (lens). Unfortunately, the processing
More informationHoriba LabRAM ARAMIS Raman Spectrometer Revision /28/2016 Page 1 of 11. Horiba Jobin-Yvon LabRAM Aramis - Raman Spectrometer
Page 1 of 11 Horiba Jobin-Yvon LabRAM Aramis - Raman Spectrometer The Aramis Raman system is a software selectable multi-wavelength Raman system with mapping capabilities with a 400mm monochromator and
More informationCHAPTER 11 HPD (Hybrid Photo-Detector)
CHAPTER 11 HPD (Hybrid Photo-Detector) HPD (Hybrid Photo-Detector) is a completely new photomultiplier tube that incorporates a semiconductor element in an evacuated electron tube. In HPD operation, photoelectrons
More informationIntroduction to Computer Vision
Introduction to Computer Vision CS / ECE 181B Thursday, April 1, 2004 Course Details HW #0 and HW #1 are available. Course web site http://www.ece.ucsb.edu/~manj/cs181b Syllabus, schedule, lecture notes,
More informationPhotoelectric effect
Photoelectric effect Objective Study photoelectric effect. Measuring and Calculating Planck s constant, h. Measuring Current-Voltage Characteristics of photoelectric Spectral Lines. Theory Experiments
More informationAstronomical Detectors. Lecture 3 Astronomy & Astrophysics Fall 2011
Astronomical Detectors Lecture 3 Astronomy & Astrophysics Fall 2011 Detector Requirements Record incident photons that have been captured by the telescope. Intensity, Phase, Frequency, Polarization Difficulty
More informationHomework Set 3.5 Sensitive optoelectronic detectors: seeing single photons
Homework Set 3.5 Sensitive optoelectronic detectors: seeing single photons Due by 12:00 noon (in class) on Tuesday, Nov. 7, 2006. This is another hybrid lab/homework; please see Section 3.4 for what you
More informationGamma Ray Spectroscopy with NaI(Tl) and HPGe Detectors
Nuclear Physics #1 Gamma Ray Spectroscopy with NaI(Tl) and HPGe Detectors Introduction: In this experiment you will use both scintillation and semiconductor detectors to study γ- ray energy spectra. The
More informationPh 3455 The Photoelectric Effect
Ph 3455 The Photoelectric Effect Required background reading Tipler, Llewellyn, section 3-3 Prelab Questions 1. In this experiment you will be using a mercury lamp as the source of photons. At the yellow
More informationCharged-Coupled Devices
Charged-Coupled Devices Charged-Coupled Devices Useful texts: Handbook of CCD Astronomy Steve Howell- Chapters 2, 3, 4.4 Measuring the Universe George Rieke - 3.1-3.3, 3.6 CCDs CCDs were invented in 1969
More informationSTEM Spectrum Imaging Tutorial
STEM Spectrum Imaging Tutorial Gatan, Inc. 5933 Coronado Lane, Pleasanton, CA 94588 Tel: (925) 463-0200 Fax: (925) 463-0204 April 2001 Contents 1 Introduction 1.1 What is Spectrum Imaging? 2 Hardware 3
More informationErrata to First Printing 1 2nd Edition of of The Handbook of Astronomical Image Processing
Errata to First Printing 1 nd Edition of of The Handbook of Astronomical Image Processing 1. Page 47: In nd line of paragraph. Following Equ..17, change 4 to 14. Text should read as follows: The dark frame
More informationLaboratory 1: Uncertainty Analysis
University of Alabama Department of Physics and Astronomy PH101 / LeClair May 26, 2014 Laboratory 1: Uncertainty Analysis Hypothesis: A statistical analysis including both mean and standard deviation can
More informationUniversity of Wisconsin Chemistry 524 Spectroscopic Components *
University of Wisconsin Chemistry 524 Spectroscopic Components * In journal articles, presentations, and textbooks, chemical instruments are often represented as block diagrams. These block diagrams highlight
More informationExercise questions for Machine vision
Exercise questions for Machine vision This is a collection of exercise questions. These questions are all examination alike which means that similar questions may appear at the written exam. I ve divided
More informationAbstract. Preface. Acknowledgments
Contents Abstract Preface Acknowledgments iv v vii 1 Introduction 1 1.1 A Very Brief History of Visible Detectors in Astronomy................ 1 1.2 The CCD: Astronomy s Champion Workhorse......................
More informationLight gathering Power: Magnification with eyepiece:
Telescopes Light gathering Power: The amount of light that can be gathered by a telescope in a given amount of time: t 1 /t 2 = (D 2 /D 1 ) 2 The larger the diameter the smaller the amount of time. If
More informationScanArray Overview. Principle of Operation. Instrument Components
ScanArray Overview The GSI Lumonics ScanArrayÒ Microarray Analysis System is a scanning laser confocal fluorescence microscope that is used to determine the fluorescence intensity of a two-dimensional
More informationDetermining MTF with a Slant Edge Target ABSTRACT AND INTRODUCTION
Determining MTF with a Slant Edge Target Douglas A. Kerr Issue 2 October 13, 2010 ABSTRACT AND INTRODUCTION The modulation transfer function (MTF) of a photographic lens tells us how effectively the lens
More informationFRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION
FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION Revised November 15, 2017 INTRODUCTION The simplest and most commonly described examples of diffraction and interference from two-dimensional apertures
More informationWFC3 TV3 Testing: IR Channel Nonlinearity Correction
Instrument Science Report WFC3 2008-39 WFC3 TV3 Testing: IR Channel Nonlinearity Correction B. Hilbert 2 June 2009 ABSTRACT Using data taken during WFC3's Thermal Vacuum 3 (TV3) testing campaign, we have
More information8.2 IMAGE PROCESSING VERSUS IMAGE ANALYSIS Image processing: The collection of routines and
8.1 INTRODUCTION In this chapter, we will study and discuss some fundamental techniques for image processing and image analysis, with a few examples of routines developed for certain purposes. 8.2 IMAGE
More informationBASLER A601f / A602f
Camera Specification BASLER A61f / A6f Measurement protocol using the EMVA Standard 188 3rd November 6 All values are typical and are subject to change without prior notice. CONTENTS Contents 1 Overview
More informationAstro-photography. Daguerreotype: on a copper plate
AST 1022L Astro-photography 1840-1980s: Photographic plates were astronomers' main imaging tool At right: first ever picture of the full moon, by John William Draper (1840) Daguerreotype: exposure using
More informationCCD User s Guide SBIG ST7E CCD camera and Macintosh ibook control computer with Meade flip mirror assembly mounted on LX200
Massachusetts Institute of Technology Department of Earth, Atmospheric, and Planetary Sciences Handout 8 /week of 2002 March 18 12.409 Hands-On Astronomy, Spring 2002 CCD User s Guide SBIG ST7E CCD camera
More informationINTRODUCTION TO CCD IMAGING
ASTR 1030 Astronomy Lab 85 Intro to CCD Imaging INTRODUCTION TO CCD IMAGING SYNOPSIS: In this lab we will learn about some of the advantages of CCD cameras for use in astronomy and how to process an image.
More informationPixel Response Effects on CCD Camera Gain Calibration
1 of 7 1/21/2014 3:03 PM HO M E P R O D UC T S B R IE F S T E C H NO T E S S UP P O RT P UR C HA S E NE W S W E B T O O L S INF O C O NTA C T Pixel Response Effects on CCD Camera Gain Calibration Copyright
More informationAPPENDIX D: ANALYZING ASTRONOMICAL IMAGES WITH MAXIM DL
APPENDIX D: ANALYZING ASTRONOMICAL IMAGES WITH MAXIM DL Written by T.Jaeger INTRODUCTION Early astronomers relied on handmade sketches to record their observations (see Galileo s sketches of Jupiter s
More informationInformation & Instructions
KEY FEATURES 1. USB 3.0 For the Fastest Transfer Rates Up to 10X faster than regular USB 2.0 connections (also USB 2.0 compatible) 2. High Resolution 4.2 MegaPixels resolution gives accurate profile measurements
More informationOptical Performance of Nikon F-Mount Lenses. Landon Carter May 11, Measurement and Instrumentation
Optical Performance of Nikon F-Mount Lenses Landon Carter May 11, 2016 2.671 Measurement and Instrumentation Abstract In photographic systems, lenses are one of the most important pieces of the system
More informationAn Introduction to CCDs. The basic principles of CCD Imaging is explained.
An Introduction to CCDs. The basic principles of CCD Imaging is explained. Morning Brain Teaser What is a CCD? Charge Coupled Devices (CCDs), invented in the 1970s as memory devices. They improved the
More informationNature Methods: doi: /nmeth Supplementary Figure 1. Resolution of lysozyme microcrystals collected by continuous rotation.
Supplementary Figure 1 Resolution of lysozyme microcrystals collected by continuous rotation. Lysozyme microcrystals were visualized by cryo-em prior to data collection and a representative crystal is
More informationImage acquisition. In both cases, the digital sensing element is one of the following: Line array Area array. Single sensor
Image acquisition Digital images are acquired by direct digital acquisition (digital still/video cameras), or scanning material acquired as analog signals (slides, photographs, etc.). In both cases, the
More informationClass #9: Experiment Diodes Part II: LEDs
Class #9: Experiment Diodes Part II: LEDs Purpose: The objective of this experiment is to become familiar with the properties and uses of LEDs, particularly as a communication device. This is a continuation
More informationObservational Astronomy
Observational Astronomy Instruments The telescope- instruments combination forms a tightly coupled system: Telescope = collecting photons and forming an image Instruments = registering and analyzing the
More informationFTA SI-640 High Speed Camera Installation and Use
FTA SI-640 High Speed Camera Installation and Use Last updated November 14, 2005 Installation The required drivers are included with the standard Fta32 Video distribution, so no separate folders exist
More informationWhite Paper High Dynamic Range Imaging
WPE-2015XI30-00 for Machine Vision What is Dynamic Range? Dynamic Range is the term used to describe the difference between the brightest part of a scene and the darkest part of a scene at a given moment
More informationThermography. White Paper: Understanding Infrared Camera Thermal Image Quality
Electrophysics Resource Center: White Paper: Understanding Infrared Camera 373E Route 46, Fairfield, NJ 07004 Phone: 973-882-0211 Fax: 973-882-0997 www.electrophysics.com Understanding Infared Camera Electrophysics
More informationPhotometry. Variable Star Photometry
Variable Star Photometry Photometry One of the most basic of astronomical analysis is photometry, or the monitoring of the light output of an astronomical object. Many stars, be they in binaries, interacting,
More informationCameras CS / ECE 181B
Cameras CS / ECE 181B Image Formation Geometry of image formation (Camera models and calibration) Where? Radiometry of image formation How bright? What color? Examples of cameras What is a Camera? A camera
More informationImage Enhancement (from Chapter 13) (V6)
Image Enhancement (from Chapter 13) (V6) Astronomical images often span a wide range of brightness, while important features contained in them span a very narrow range of brightness. Alternatively, interesting
More informationOn spatial resolution
On spatial resolution Introduction How is spatial resolution defined? There are two main approaches in defining local spatial resolution. One method follows distinction criteria of pointlike objects (i.e.
More informationAgilEye Manual Version 2.0 February 28, 2007
AgilEye Manual Version 2.0 February 28, 2007 1717 Louisiana NE Suite 202 Albuquerque, NM 87110 (505) 268-4742 support@agiloptics.com 2 (505) 268-4742 v. 2.0 February 07, 2007 3 Introduction AgilEye Wavefront
More informationInstructions for the Experiment
Instructions for the Experiment Excitonic States in Atomically Thin Semiconductors 1. Introduction Alongside with electrical measurements, optical measurements are an indispensable tool for the study of
More informationPh 3324 The Scintillation Detector and Gamma Ray Spectroscopy
Ph 3324 The Scintillation Detector and Gamma Ray Spectroscopy Required background reading Attached are several pages from an appendix on the web for Tipler-Llewellyn Modern Physics. Read the section on
More informationPresented by Jerry Hubbell Lake of the Woods Observatory (MPC I24) President, Rappahannock Astronomy Club
Presented by Jerry Hubbell Lake of the Woods Observatory (MPC I24) President, Rappahannock Astronomy Club ENGINEERING A FIBER-FED FED SPECTROMETER FOR ASTRONOMICAL USE Objectives Discuss the engineering
More informationBasler aca640-90gm. Camera Specification. Measurement protocol using the EMVA Standard 1288 Document Number: BD Version: 02
Basler aca64-9gm Camera Specification Measurement protocol using the EMVA Standard 1288 Document Number: BD584 Version: 2 For customers in the U.S.A. This equipment has been tested and found to comply
More informationSystem and method for subtracting dark noise from an image using an estimated dark noise scale factor
Page 1 of 10 ( 5 of 32 ) United States Patent Application 20060256215 Kind Code A1 Zhang; Xuemei ; et al. November 16, 2006 System and method for subtracting dark noise from an image using an estimated
More informationME 6406 MACHINE VISION. Georgia Institute of Technology
ME 6406 MACHINE VISION Georgia Institute of Technology Class Information Instructor Professor Kok-Meng Lee MARC 474 Office hours: Tues/Thurs 1:00-2:00 pm kokmeng.lee@me.gatech.edu (404)-894-7402 Class
More informationFigure 1 HDR image fusion example
TN-0903 Date: 10/06/09 Using image fusion to capture high-dynamic range (hdr) scenes High dynamic range (HDR) refers to the ability to distinguish details in scenes containing both very bright and relatively
More informationA flexible compact readout circuit for SPAD arrays ABSTRACT Keywords: 1. INTRODUCTION 2. THE SPAD 2.1 Operation 7780C - 55
A flexible compact readout circuit for SPAD arrays Danial Chitnis * and Steve Collins Department of Engineering Science University of Oxford Oxford England OX13PJ ABSTRACT A compact readout circuit that
More informationAmorphous Selenium Direct Radiography for Industrial Imaging
DGZfP Proceedings BB 67-CD Paper 22 Computerized Tomography for Industrial Applications and Image Processing in Radiology March 15-17, 1999, Berlin, Germany Amorphous Selenium Direct Radiography for Industrial
More informationElectronic Instrumentation for Radiation Detection Systems
Electronic Instrumentation for Radiation Detection Systems January 23, 2018 Joshua W. Cates, Ph.D. and Craig S. Levin, Ph.D. Course Outline Lecture Overview Brief Review of Radiation Detectors Detector
More informationDECISION NUMBER FOURTEEN TO THE TREATY ON OPEN SKIES
DECISION NUMBER FOURTEEN TO THE TREATY ON OPEN SKIES OSCC.DEC 14 12 October 1994 METHODOLOGY FOR CALCULATING THE MINIMUM HEIGHT ABOVE GROUND LEVEL AT WHICH EACH VIDEO CAMERA WITH REAL TIME DISPLAY INSTALLED
More informationEXPERIMENT 3 THE PHOTOELECTRIC EFFECT
EXPERIMENT 3 THE PHOTOELECTRIC EFFECT Equipment List Included Equipment 1. Mercury Light Source Enclosure 2. Track, 60 cm 3. Photodiode Enclosure 4. Mercury Light Source Power Supply 5. DC Current Amplifier
More informationExamination, TEN1, in courses SK2500/SK2501, Physics of Biomedical Microscopy,
KTH Applied Physics Examination, TEN1, in courses SK2500/SK2501, Physics of Biomedical Microscopy, 2009-06-05, 8-13, FB51 Allowed aids: Compendium Imaging Physics (handed out) Compendium Light Microscopy
More informationTraining Guide for Leica SP8 Confocal/Multiphoton Microscope
Training Guide for Leica SP8 Confocal/Multiphoton Microscope LAS AF v3.3 Optical Imaging & Vital Microscopy Core Baylor College of Medicine (2017) Power ON Routine 1 2 Turn ON power switch for epifluorescence
More informationEXPRIMENT 3 COUPLING FIBERS TO SEMICONDUCTOR SOURCES
EXPRIMENT 3 COUPLING FIBERS TO SEMICONDUCTOR SOURCES OBJECTIVES In this lab, firstly you will learn to couple semiconductor sources, i.e., lightemitting diodes (LED's), to optical fibers. The coupling
More informationImage Processing Tutorial Basic Concepts
Image Processing Tutorial Basic Concepts CCDWare Publishing http://www.ccdware.com 2005 CCDWare Publishing Table of Contents Introduction... 3 Starting CCDStack... 4 Creating Calibration Frames... 5 Create
More informationISIS A beginner s guide
ISIS A beginner s guide Conceived of and written by Christian Buil, ISIS is a powerful astronomical spectral processing application that can appear daunting to first time users. While designed as a comprehensive
More informationPentaVac Vacuum Technology
PentaVac Vacuum Technology Scientific CCD Applications CCD imaging sensors are used extensively in high-end imaging applications, enabling acquisition of quantitative images with both high (spatial) resolution
More informationZeiss 780 Training Notes
Zeiss 780 Training Notes Turn on Main Switch, System PC and Components Switches 780 Start up sequence Do you need the argon laser (458, 488, 514 nm lines)? Yes Turn on the laser s main power switch and
More informationAn Introduction to the Silicon Photomultiplier
An Introduction to the Silicon Photomultiplier The Silicon Photomultiplier (SPM) addresses the challenge of detecting, timing and quantifying low-light signals down to the single-photon level. Traditionally
More informationCCDS. Lesson I. Wednesday, August 29, 12
CCDS Lesson I CCD OPERATION The predecessor of the CCD was a device called the BUCKET BRIGADE DEVICE developed at the Phillips Research Labs The BBD was an analog delay line, made up of capacitors such
More informationROBOT VISION. Dr.M.Madhavi, MED, MVSREC
ROBOT VISION Dr.M.Madhavi, MED, MVSREC Robotic vision may be defined as the process of acquiring and extracting information from images of 3-D world. Robotic vision is primarily targeted at manipulation
More informationMultianode Photo Multiplier Tubes as Photo Detectors for Ring Imaging Cherenkov Detectors
Multianode Photo Multiplier Tubes as Photo Detectors for Ring Imaging Cherenkov Detectors F. Muheim a edin]department of Physics and Astronomy, University of Edinburgh Mayfield Road, Edinburgh EH9 3JZ,
More informationVery short introduction to light microscopy and digital imaging
Very short introduction to light microscopy and digital imaging Hernan G. Garcia August 1, 2005 1 Light Microscopy Basics In this section we will briefly describe the basic principles of operation and
More informationInstructions for gg Coincidence with 22 Na. Overview of the Experiment
Overview of the Experiment Instructions for gg Coincidence with 22 Na 22 Na is a radioactive element that decays by converting a proton into a neutron: about 90% of the time through β + decay and about
More informatione2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions
e2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions e2v s Onyx family of image sensors is designed for the most demanding outdoor camera and industrial machine vision applications,
More informationNGC user report. Gert Finger
NGC user report Gert Finger Overview user s perspective of the transition from IRACE to NGC Performance of NGC prototypes with optical and infrared detectors Implementation of two special features on the
More informationNAME SECTION PERFORMANCE TASK # 3. Part I. Qualitative Relationships
NAME SECTION PARTNERS DATE PERFORMANCE TASK # 3 You must work in teams of three or four (ask instructor) and will turn in ONE report. Answer all questions. Write in complete sentences. You must hand this
More informationThe new CMOS Tracking Camera used at the Zimmerwald Observatory
13-0421 The new CMOS Tracking Camera used at the Zimmerwald Observatory M. Ploner, P. Lauber, M. Prohaska, P. Schlatter, J. Utzinger, T. Schildknecht, A. Jaeggi Astronomical Institute, University of Bern,
More informationGXCapture 8.1 Instruction Manual
GT Vision image acquisition, managing and processing software GXCapture 8.1 Instruction Manual Contents of the Instruction Manual GXC is the shortened name used for GXCapture Square brackets are used to
More informationTraining Guide for Carl Zeiss LSM 5 LIVE Confocal Microscope
Training Guide for Carl Zeiss LSM 5 LIVE Confocal Microscope AIM 4.2 Optical Imaging & Vital Microscopy Core Baylor College of Medicine (2017) Power ON Routine 1 2 Verify that main power switches on the
More informationGoal of the project. TPC operation. Raw data. Calibration
Goal of the project The main goal of this project was to realise the reconstruction of α tracks in an optically read out GEM (Gas Electron Multiplier) based Time Projection Chamber (TPC). Secondary goal
More informationFIBER OPTICS. Prof. R.K. Shevgaonkar. Department of Electrical Engineering. Indian Institute of Technology, Bombay. Lecture: 20
FIBER OPTICS Prof. R.K. Shevgaonkar Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture: 20 Photo-Detectors and Detector Noise Fiber Optics, Prof. R.K. Shevgaonkar, Dept.
More informationGentec-EO USA. T-RAD-USB Users Manual. T-Rad-USB Operating Instructions /15/2010 Page 1 of 24
Gentec-EO USA T-RAD-USB Users Manual Gentec-EO USA 5825 Jean Road Center Lake Oswego, Oregon, 97035 503-697-1870 voice 503-697-0633 fax 121-201795 11/15/2010 Page 1 of 24 System Overview Welcome to the
More informationLecture 8 Optical Sensing. ECE 5900/6900 Fundamentals of Sensor Design
ECE 5900/6900: Fundamentals of Sensor Design Lecture 8 Optical Sensing 1 Optical Sensing Q: What are we measuring? A: Electromagnetic radiation labeled as Ultraviolet (UV), visible, or near,mid-, far-infrared
More informationCHAPTER 9 POSITION SENSITIVE PHOTOMULTIPLIER TUBES
CHAPTER 9 POSITION SENSITIVE PHOTOMULTIPLIER TUBES The current multiplication mechanism offered by dynodes makes photomultiplier tubes ideal for low-light-level measurement. As explained earlier, there
More information