Machine Vision: Image Formation
MediaRobotics Lab, Feb 2010
References:
- Forsyth / Ponce: Computer Vision
- Horn: Robot Vision
- Kodak CCD Primer, #KCP-001
- Adaptive Fuzzy Color Interpolation, Journal of Electronic Imaging, Vol. 11(3), July 2002
http://homepages.inf.ed.ac.uk/rbf/cvonline/local_copies/owens/lect1/node2.html
Thin lens equation: 1/f = 1/d_image + 1/d_object (f = focal length)
F-stop = focal length / diameter of the lens aperture.
F-stops are spaced so that each increment changes the light capture capacity by a factor of 2 (which is a function of aperture area); successive f-numbers therefore differ by a factor of sqrt(2):
f-number = sqrt(2)^n for n = 0, 1, 2, 3, 4, ... >> 1, 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32
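The thin lens equation and the f-number sequence can be sketched numerically. A minimal Python sketch; the 50 mm focal length and 2 m object distance are illustrative values, not from the slides:

```python
import math

def image_distance(f_mm, object_mm):
    """Thin-lens equation 1/f = 1/d_image + 1/d_object, solved for d_image."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

def f_number(n):
    """Standard f-stop sequence: each step scales the aperture diameter by
    sqrt(2), i.e. the light-gathering area by 2."""
    return math.sqrt(2) ** n

# Focusing a 50 mm lens on an object 2 m away: the image forms
# slightly behind the focal plane (~51.3 mm).
d_i = image_distance(50.0, 2000.0)

# First few f-numbers: 1, 1.4, 2, 2.8, 4
stops = [round(f_number(n), 1) for n in range(5)]
```

Note that for a distant object (d_object >> f) the image distance approaches the focal length, which is why lenses focused at infinity sit closest to the sensor.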
Camera Obscura, Reinerus Gemma-Frisius, 1544. Found in Gernsheim, H., The Origins of Photography
http://www.robotfunk.com/rss/make_magazine/2008/03/
Joseph Nicéphore Niépce, View from the Window at Le Gras, 1826 (exposure: ~8hrs)
Louis Daguerre, first daguerreotype, 1837 image exposed directly onto a mirror-polished surface of silver halide particles deposited by iodine vapor. (exposure time 15 minutes)
Henry Peach Robinson, 1858 First photomontage (combining multiple negatives)
Image carriers
- Niépce, 1826: paper soaked in silver chloride in a camera obscura
- Daguerre, 1836: mercury fumes on silver plating on copper
- Le Gray / Archer, 1850: wet-plate negative/positive process
- Maddox, 1870: gelatin process (no immediate development needed)
- Eastman, 1889: photographic film
- Lumière, 1908: color photographic film
- CCD, 1970s: charge coupled devices (originally designed for computer memory)
CCD: Charge Coupled Device
- 1969: Philips Research Labs (Sangster/Teer) invent the Bucket Brigade Device (transfers charge packets from one transistor to another)
- 1970: Bell Labs (Boyle/Smith) extend the concept with a transport mechanism from one capacitor to the next >> the charge coupled device: a memory device that happens to be sensitive to light
- 1973: JPL initiates a scientific-grade large-array CCD program >> first used as image sensors in astronomy
.. Maxwell >> Hertz/Planck >> Einstein >> Feynman
[Energy-level diagram: 0, 1hf, 2hf, 3hf, 4hf; dE = h*f]
E = h*f = h*c / lambda
h*f = emission energy ("Work Function" W) + 1/2 m v^2
h = Planck's constant = 6.63 x 10^-34 Js
v = max electron speed
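The photoelectric relation h*f = W + 1/2 m v^2 can be evaluated directly. A short sketch; the work function value used below (6.5e-19 J, roughly 4 eV) is an assumed illustrative number, not taken from the slides:

```python
import math

H = 6.63e-34     # Planck's constant, J*s
C = 3.0e8        # speed of light, m/s
M_E = 9.11e-31   # electron mass, kg

def photon_energy(wavelength_m):
    """E = h*f = h*c / lambda."""
    return H * C / wavelength_m

def max_electron_speed(wavelength_m, work_function_j):
    """h*f = W + (1/2) m v^2  ->  v = sqrt(2 * (h*f - W) / m).
    Returns 0 when the photon carries less energy than the work function."""
    excess = photon_energy(wavelength_m) - work_function_j
    return math.sqrt(2.0 * excess / M_E) if excess > 0 else 0.0

# Green light (550 nm) is below this work function, so no electron escapes;
# UV light (250 nm) frees electrons at a few 1e5 m/s.
v_green = max_electron_speed(550e-9, 6.5e-19)
v_uv = max_electron_speed(250e-9, 6.5e-19)
```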
CCD: Charge Coupled Device
- ~500,000 cells per 1/3-inch chip, distributed according to human color sensitivity (more green than red or blue)
- light hits the array of detector cells with dE = h*f
- light-sensitive diodes, each sensitive to the R, G or B band, translate the flux of light energy into electric charge
- freed electrons are stored in potential wells as charge
- sequential reading of this charge and conversion to voltage
- linear mapping of this voltage to light intensity
- filtering and spatial interpolation derive the 3 base color bands
- the 3 base color bands combine to represent any color
- scanning the chip line by line results in a video signal
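The linear mapping from collected charge to a digital intensity value can be sketched as below. The full-well capacity of 30,000 electrons is an assumed illustrative figure, not one from the slides; real sensors also apply gain and offset corrections before quantization:

```python
def charge_to_intensity(electrons, full_well=30000, bits=8):
    """Linear mapping of collected charge to a digital intensity value.
    full_well (electrons) is an assumed illustrative well capacity;
    charge beyond it saturates the cell."""
    levels = (1 << bits) - 1          # 255 for 8 bits
    frac = min(electrons, full_well) / full_well
    return round(frac * levels)

charge_to_intensity(15000)   # half-full well -> mid-scale intensity
```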
Source: Kodak CCD Primer, #KCP-001
CCD performance is a function of:
- wavelength sensitivity of the cells
- number of cells/pixels (rows x columns)
- physical size of each pixel (6-20 um)
- depth of each cell: number of bits used to code brightness
- noise cancellation techniques
Noise
- Thermal noise: thermal energy as well as photons can free electrons (>> dark current)
- Photodiode noise: the CCD is impure and imperfect; quantum efficiency (Q/E) is not 100%
- Photon noise: photon arrival is not constant over time (shot noise)
- Electronics noise: stray capacitances vary the effective voltage
Single CCD >> spatial color interpolation (Bayer filter)
- columns of alternating color filters
- 50% of the cells are green, twice as many as red or blue >> human vision is more sensitive to G
- each cell has an intensity value of 0-255
- merge data from N (typically 4) cells to map to one colored pixel value
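The merge-N-cells step can be sketched as below, assuming an RGGB block layout. This is the simplest scheme, collapsing each 2x2 block into one pixel; real demosaicing (e.g. the adaptive fuzzy interpolation cited on the title slide) interpolates across neighbouring blocks to keep full resolution:

```python
def bayer_block_to_rgb(block):
    """Merge one 2x2 Bayer block (RGGB layout assumed) into a single
    RGB pixel: R and B pass through, the two G samples are averaged.
    block = [[R, G],
             [G, B]] with 0-255 intensity values."""
    r = block[0][0]
    g = (block[0][1] + block[1][0]) / 2.0
    b = block[1][1]
    return (r, g, b)

bayer_block_to_rgb([[200, 120], [130, 40]])   # -> (200, 125.0, 40)
```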
Fuji Film, Super CCD
Charge versus time graphs for an RGB pixel inside a CCD cell
SONY HAD sensor, schematic (layers: on-chip micro lens, color filter, photo shield, poly-silicon register, sensor)
Camera: image formation (lens) + image capture (CCD). CCD: charge coupled device. Reprinted from the January 2001 issue of Photonics Spectra.
ATMEL TH7887A Area Array CCD Image Sensor, 1024x1024
Colour signals: Chroma Subsampling Standards

RGB, YIQ (NTSC) and YCrCb or YUV (PAL) representations are commonly used in video. The human eye has less spatial resolution in colour than in luminance, so most video standards use some form of chroma subsampling: the two chrominance signals can tolerate a reduced resolution relative to the luminance signal, and the various digitization formats exploit this. YIQ and YCrCb use these properties of the human eye to prioritize information.

In both YIQ and YCrCb, 1 luminance (Y) and 2 chrominance components describe the colour of each pixel:
- the luminance component (Y) describes the brightness of a pixel and varies from black (lowest) to white (highest): Y = 0.299R + 0.587G + 0.114B
- the chrominance components carry the colour information of a pixel: Cr = (R-Y)/2 + 0.5, Cb = (B-Y)/1.6 + 0.5 in YCrCb; I = 0.596R - 0.275G - 0.321B, Q = 0.212R - 0.528G + 0.311B in YIQ

Subsampling formats:
- 4:4:4: no subsampling of the chrominance components
- 4:2:2: both chrominance components are subsampled horizontally, so the width of each chrominance component is half the width of the luminance component
- 4:1:1: 4:1 horizontal subsampling of the colour difference signals; not used extensively
- 4:2:0: both chrominance components are subsampled horizontally and vertically, so the width and height of each chrominance component are half those of the luminance component
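The colour conversion and 4:2:0 subsampling can be sketched in Python. The YCrCb formulas below are the slide's approximations (with the standard luminance weights 0.299/0.587/0.114) applied to R, G, B values normalized to [0, 1]; broadcast-accurate conversions use slightly different scale factors:

```python
def rgb_to_ycrcb(r, g, b):
    """Approximate YCrCb conversion; r, g, b in [0, 1]."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) / 2.0 + 0.5
    cb = (b - y) / 1.6 + 0.5
    return y, cr, cb

def subsample_420(chroma):
    """4:2:0 subsampling: average each 2x2 block of a chroma plane,
    halving its width and height (chroma is a list of equal-length rows)."""
    out = []
    for i in range(0, len(chroma), 2):
        row = []
        for j in range(0, len(chroma[0]), 2):
            row.append((chroma[i][j] + chroma[i][j + 1] +
                        chroma[i + 1][j] + chroma[i + 1][j + 1]) / 4.0)
        out.append(row)
    return out
```

A 4:2:2 variant would average only horizontal pairs, keeping full vertical chroma resolution; the structure of the loop is the same.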