Sensors and Sensing Cameras and Camera Calibration Todor Stoyanov Mobile Robotics and Olfaction Lab Center for Applied Autonomous Sensor Systems Örebro University, Sweden todor.stoyanov@oru.se 20.11.2014 T. Stoyanov (MRO Lab, AASS) Sensors & Sensing 20.11.2014 1 / 24
Outline
1 Camera Models
2 Camera Calibration
3 Color, Infrared and Thermal Cameras
4 Image Noise and Filters
5 Practice: Custom Camera Systems
Camera Models Electromagnetic Spectrum
Cameras are passive devices that measure electromagnetic radiation reflected by objects in the environment. Conventional cameras detect light in the visible range of the electromagnetic spectrum: wavelengths between 430 nm and 790 nm... but plenty of cameras are built for other ranges: e.g., infrared, thermal, UV.
[Figure: the electromagnetic spectrum by wavelength, from radio through microwave, infrared, visible and ultraviolet to X-ray and gamma radiation.]
Camera Models CCD Sensors, Optics and Shutters
Modern cameras consist of a light-sensitive element, a lens and (optionally) a shutter. The light sensor is usually implemented as a Charge-Coupled Device (CCD) or a CMOS sensor chip. Lenses focus light onto the sensor array. Mechanical shutters can be used to expose the chip for only a short period of time; electronic shutters are often used instead.
Camera Models Digital Images
A digital image I can be thought of as a function f(x, y) : X × Y → Z, where X = [0, P_x] ⊂ ℕ and Y = [0, P_y] ⊂ ℕ are pixel coordinates in the image plane. Depending on the type of image, Z can be:
Binary image if Z = {0, 1}
Gray-scale image if Z ⊂ ℝ
Color image if Z ⊂ ℝ³
Each pixel in the image corresponds to a single cell of the CCD array.
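The three image types above map directly onto array shapes and value sets. A minimal numpy sketch (the pixel values are made up for illustration):

```python
import numpy as np

# A gray-scale image: f(x, y) -> Z with Z a subset of the reals,
# stored as a 2D array indexed by pixel coordinates.
gray = np.array([[0.1, 0.8, 0.3],
                 [0.9, 0.2, 0.7],
                 [0.4, 0.6, 0.5]])

# A binary image restricts Z to {0, 1}: threshold the gray values.
binary = (gray > 0.5).astype(np.uint8)

# A color image maps each pixel to an RGB triple (Z a subset of R^3),
# i.e. a third array axis of length 3.
color = np.stack([gray, gray * 0.5, 1.0 - gray], axis=-1)
```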
Camera Models The Pinhole Camera
A pinhole camera is a simple camera without a lens that projects light directly onto an image plane. The pinhole camera model can be extended to model complex cameras.
[Figure: pinhole geometry, showing the image plane, the camera center and the principal axis.]
Some definitions:
Camera Models The Pinhole Camera
Camera center: the focal point of all rays converging to the camera
Principal axis: by definition this is the Ẑ axis pointing out of the camera center
Image plane: the CCD plane where the image is acquired
Focal length f: the distance from the camera center to the image plane, measured along the principal axis
Camera Models The Pinhole Camera
Given a point s = (x, y, z, 1)ᵀ in the world coordinate frame (homogeneous coordinates), we can obtain the projection of the point to a corresponding pixel (x_u, y_u) on the image plane as:

\[
\begin{pmatrix} x_u \\ y_u \\ 1 \end{pmatrix}
\simeq
\begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
H
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
= K H s \qquad (1)
\]

Here H is the 3×4 extrinsic matrix, and the equality holds up to scale: the result is normalized by its third coordinate.
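The projection of Eq. 1 takes a few lines of numpy. The intrinsic values below are invented for illustration, and the function name is my own:

```python
import numpy as np

def project(s, K, H):
    """Project homogeneous world point s through extrinsics H and
    intrinsics K, normalizing by the third coordinate (Eq. 1 holds
    only up to scale)."""
    p = K @ H @ s
    return p / p[2]

# Example intrinsics: focal lengths f_x, f_y and principal point c_x, c_y.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
# Camera at the world origin looking down the principal axis: H = [I | 0].
H = np.hstack([np.eye(3), np.zeros((3, 1))])

s = np.array([0.2, -0.1, 2.0, 1.0])   # world point 2 m in front of the camera
x_u, y_u, _ = project(s, K, H)
```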
Camera Models Lens Distortion 1
In practice, adding a lens to the system introduces several different types of distortion:
radial distortion is due to imperfections of the lens curvature
tangential distortion is due to imperfect alignment of the lens center and the principal axis
other types of distortion are more difficult to model.
1 http://www.uni-koeln.de/~al001/radcor_files/hs100.htm
Camera Models Modeling Distortion
Going back to Eq. 1, consider the projection of s = (x, y, z, 1)ᵀ onto the normalized image plane:

\[
s' = \begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix} = \begin{pmatrix} x/z \\ y/z \\ 1 \end{pmatrix} \qquad (3)
\]

We define r² = x'² + y'² as the squared radius of the projected point, relative to the principal point. The distorted point ŝ, i.e. where the lens actually images s, can then be computed as:

\[
\hat{s} = \begin{pmatrix}
x'(1 + k_1 r^2 + k_2 r^4) + 2 p_1 x' y' + p_2 (r^2 + 2 x'^2) \\
y'(1 + k_1 r^2 + k_2 r^4) + p_1 (r^2 + 2 y'^2) + 2 p_2 x' y' \\
1
\end{pmatrix} \qquad (4)
\]

The pixel coordinates of s are then obtained as:

\[
\begin{pmatrix} x_u \\ y_u \\ 1 \end{pmatrix} = K \hat{s} \qquad (5)
\]
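Equations 3–5 translate almost line for line into code. A minimal sketch with invented intrinsics; with all coefficients zero it reduces to the plain pinhole projection:

```python
import numpy as np

# Hypothetical intrinsics, for illustration only.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def distort(s, K, k1, k2, p1, p2):
    """Apply the radial/tangential model of Eq. 4 to homogeneous world
    point s and return the pixel coordinates of Eq. 5."""
    xp, yp = s[0] / s[2], s[1] / s[2]       # normalized coordinates, Eq. 3
    r2 = xp ** 2 + yp ** 2                  # squared radius
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2
    xd = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp ** 2)
    yd = yp * radial + p1 * (r2 + 2 * yp ** 2) + 2 * p2 * xp * yp
    u, v, w = K @ np.array([xd, yd, 1.0])
    return u / w, v / w
```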
Camera Calibration 2D/3D Calibration Patterns 2
Cameras are calibrated in order to determine the focal lengths and center offsets of the camera matrix K, the radial distortion coefficients k_1, k_2 and the tangential distortion coefficients p_1, p_2. Calibration from a natural scene is not easily done, so we use calibration patterns:
2D patterns: a known pattern printed on a plane
(rarely) 3D patterns: a known 3D geometry
2 ROS camera calibration tutorial
Camera Calibration Chessboard calibration: basics
The number of chessboard squares and their size are known in advance. We fix the world reference frame to the top-left corner of the board. All points lie on a plane in the world frame, with z = 0. This means we can drop one column of rotation coefficients from H.
Note: the following slides follow the derivations as shown here 3.
3 http://ais.informatik.uni-freiburg.de/teaching/ws10/robotics2/pdfs/rob2-10-camera-calibration.pdf
Camera Calibration Chessboard calibration: formulation
For a point (x, y, z, 1)ᵀ on the pattern we have:

\[
\begin{pmatrix} x_u \\ y_u \\ 1 \end{pmatrix}
\simeq K
\begin{pmatrix}
r_{11} & r_{12} & r_{13} & t_1 \\
r_{21} & r_{22} & r_{23} & t_2 \\
r_{31} & r_{32} & r_{33} & t_3
\end{pmatrix}
\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix} \qquad (6)
\]

We set z = 0, which removes the third column of the rotation:

\[
\begin{pmatrix} x_u \\ y_u \\ 1 \end{pmatrix}
\simeq
\begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix}
r_{11} & r_{12} & t_1 \\
r_{21} & r_{22} & t_2 \\
r_{31} & r_{32} & t_3
\end{pmatrix}
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
= H \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} \qquad (7)
\]

H is called the homography matrix. Let:

\[
H = (h_1, h_2, h_3) = K (r_1, r_2, t) \qquad (10)
\]
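The homography itself is usually estimated from the detected corner correspondences with a direct linear transform. A minimal numpy sketch, assuming noise-free correspondences and at least four non-collinear points (the function name is my own):

```python
import numpy as np

def estimate_homography(pts_world, pts_image):
    """Estimate H (up to scale) from >= 4 point correspondences via the
    direct linear transform: each correspondence contributes two linear
    equations in the entries of H, and the null space of the stacked
    system is recovered with an SVD."""
    rows = []
    for (x, y), (u, v) in zip(pts_world, pts_image):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(rows, dtype=float)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)       # right singular vector of smallest value
    return H / H[2, 2]             # fix the scale ambiguity
```

With noisy corners the SVD solution is exactly the least-squares estimate, which is why the same machinery carries over to the Vb = 0 system on the next slides.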
Camera Calibration Chessboard calibration: formulation
Knowing

\[
(h_1, h_2, h_3) = K (r_1, r_2, t) \qquad (11)
\]

we have that r_1 = K⁻¹h_1 and r_2 = K⁻¹h_2. We also know that r_1 and r_2 are columns of a rotation matrix, thus they form an orthonormal basis, i.e. r_1ᵀr_2 = 0 and r_1ᵀr_1 = r_2ᵀr_2 = 1. Therefore:

\[
r_1^T r_2 = 0 \qquad (12)
\]
\[
\Rightarrow h_1^T K^{-T} K^{-1} h_2 = 0 \qquad (13)
\]

and

\[
r_1^T r_1 = r_2^T r_2 \qquad (14)
\]
\[
\Rightarrow h_1^T K^{-T} K^{-1} h_1 = h_2^T K^{-T} K^{-1} h_2 \qquad (15)
\]
\[
\Rightarrow h_1^T K^{-T} K^{-1} h_1 - h_2^T K^{-T} K^{-1} h_2 = 0 \qquad (16)
\]
Camera Calibration Chessboard calibration: solving the problem
Re-formulating equations 13 and 16 and unwrapping the coefficients of B = K⁻ᵀK⁻¹ into the vector b = (b_11, b_12, b_13, b_22, b_23, b_33) (B is symmetric), we can formulate

\[
V b = 0 \qquad (17)
\]

where V holds coefficients built from H as in equations 13 and 16. As we know the relative positions of the points on the pattern, we can obtain V for an image and solve for b. Each image contributes two equations, so several views of the pattern are needed. The parameters of K can then be obtained by Cholesky factorization of B = LLᵀ. Measurements are noisy, so we instead solve a least-squares problem, minimizing ‖Vb‖ subject to ‖b‖ = 1.
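The final Cholesky step can be checked numerically: since K is upper triangular with positive diagonal, B = K⁻ᵀK⁻¹ factors as LLᵀ with L = K⁻ᵀ, so K is recovered (up to the usual normalization K[2,2] = 1) from the factor. The intrinsic values below are invented for illustration:

```python
import numpy as np

# Hypothetical ground-truth intrinsics.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 480.0, 240.0],
              [  0.0,   0.0,   1.0]])

# B = K^{-T} K^{-1} is the matrix the linear system Vb = 0 solves for.
K_inv = np.linalg.inv(K)
B = K_inv.T @ K_inv

# Cholesky gives B = L L^T with L lower triangular; here L = K^{-T},
# so K = inv(L)^T, normalized so that K[2, 2] = 1.
L = np.linalg.cholesky(B)
K_rec = np.linalg.inv(L).T
K_rec = K_rec / K_rec[2, 2]
```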
Camera Calibration Chessboard calibration: distortion parameters
The previous derivations solve the problem for the camera matrix K, but ignore distortion. In order to solve for distortion, we need to formulate the re-projection error. The resulting non-linear optimization problem is usually solved in batch using the Levenberg-Marquardt method, linearizing around the current solution for K at every iteration. Fortunately, you don't typically have to solve the optimization problem yourself; just use one of the many toolboxes for calibration.
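The quantity the batch optimization minimizes is the RMS re-projection error. A minimal sketch, ignoring distortion for brevity (the function name is my own):

```python
import numpy as np

def reprojection_error(points_h, pixels, K, H):
    """RMS distance (in pixels) between the observed pixels and the
    pinhole projections K H s of the corresponding homogeneous 3D
    points (one point per row)."""
    proj = (K @ H @ points_h.T).T
    proj = proj[:, :2] / proj[:, 2:3]     # normalize homogeneous coords
    return float(np.sqrt(np.mean(np.sum((proj - pixels) ** 2, axis=1))))
```

A real toolbox evaluates this residual through the distortion model of Eq. 4 as well, and lets Levenberg-Marquardt adjust K, the distortion coefficients and the per-view extrinsics jointly.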
Color, Infrared and Thermal Cameras Color Cameras
The world is colorful! Color images are obtained by adding a filter designed for specific wavelengths to each pixel of the CCD array. The filter pattern is called a Bayer filter. Typically, we have red-, green- and blue-sensitive pixels. Raw pixel values often come as a stream, and camera drivers perform de-Bayering (demosaicing) to obtain the corresponding RGB vector for every pixel.
Color, Infrared and Thermal Cameras Color Spaces
There are several different models for color. RGB systems encode values in equal-sized intervals for red, green and blue. The HSV space encodes the hue, saturation and value of every pixel, with H ∈ [0°, 360°), S ∈ [0, 1], V ∈ [0, 1]. Different color spaces have advantages in different operations. Color spaces are not equivalent, but we can convert between representations.
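Python's standard library can already do the RGB ↔ HSV conversion per pixel, which is handy for experimenting with the ranges above:

```python
import colorsys

# colorsys works on floats in [0, 1]; the hue also comes back in
# [0, 1), so multiply by 360 to get degrees.
r, g, b = 1.0, 0.5, 0.0                    # an orange pixel
h, s, v = colorsys.rgb_to_hsv(r, g, b)
hue_degrees = h * 360.0

# The conversion is invertible (up to floating-point precision).
r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
```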
Color, Infrared and Thermal Cameras Infrared and Thermal Cameras
Cameras sensitive to other parts of the EM spectrum:
IR cameras measure reflected IR light. Often used with an IR diode light source for night-time security applications.
Thermal cameras can detect emitted IR light.
Image Noise and Filters Typical Noise in Images
Apart from errors due to lens distortion, images are usually corrupted by additional noise sources:
Additive Gaussian noise (independent per pixel)
Salt-and-pepper random noise
Multiplicative shot noise
Images may be post-processed to filter out noise.
Image Noise and Filters Filters and Convolution
Spatial-domain image filters work by convolution of a filter kernel (or mask) with the image. A region of predefined size is slid over the image. Each element of the mask contains a weight. The value of the filtered pixel is p'(x, y) = Σ_{i,j} w_{i,j} p(x+i, y+j). Special care should be taken at the borders.
Example 3×3 kernel:
0.1 0.1 0.1
0.1 0.5 0.1
0.1 0.1 0.1
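The sliding-window sum above can be sketched in plain numpy. Strictly this computes a cross-correlation (the kernel is not flipped), which coincides with convolution for the symmetric kernels used here; borders are handled by replicating edge pixels, one of several common choices:

```python
import numpy as np

def filter2d(image, kernel):
    """Slide the kernel over the image and compute
    p'(x, y) = sum_{i,j} w_ij * p(x+i, y+j),
    replicating edge pixels to handle the borders."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode='edge')
    out = np.zeros_like(image, dtype=float)
    # Accumulate one shifted copy of the image per kernel weight.
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + image.shape[0],
                                         j:j + image.shape[1]]
    return out
```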
Image Noise and Filters Mean filter
The mean (average, box) filter places an equal weight on all elements of the kernel, i.e. w_{i,j} = 1/(MN) for an M×N kernel. Averaging blurs out both noise and details in the image. It generates defects, e.g. ringing and axis-aligned streaks.
Image Noise and Filters Gaussian filter
A Gaussian filter uses a sampled normal distribution as its kernel. Nearby pixels contribute more to the final result. Blurs and smooths images.
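A Gaussian kernel is built by sampling the 2D normal density on the kernel grid and normalizing the weights to sum to one. A minimal sketch (the function name is my own):

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Sampled, normalized 2D Gaussian kernel: the center pixel gets
    the largest weight and weights fall off with distance."""
    ax = np.arange(size) - (size - 1) / 2.0   # offsets around the center
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()                        # weights sum to 1
```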
Image Noise and Filters Median filter
A non-linear filter: substitute pixel p with the median of all pixels inside the filter kernel. E.g., for a 3×3 filter, sort the nine values and take the 5th as the median. Less blurry than averaging; very good for removing salt-and-pepper noise.
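Because the median is not a weighted sum, the filter cannot be written as a convolution; each window is processed explicitly. A minimal sketch (edge pixels replicated at the borders):

```python
import numpy as np

def median_filter(image, size=3):
    """Replace each pixel with the median of the size x size window
    around it."""
    p = size // 2
    padded = np.pad(image, p, mode='edge')
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.median(padded[y:y + size, x:x + size])
    return out
```

A single salt pixel never survives: it is at most one of the nine sorted values, so the median comes from its unchanged neighbors.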
Image Noise and Filters Bilateral filter
The bilateral filter is an edge-preserving smoothing filter. Main idea: pixels are smoothed based on both spatial proximity (x, y coordinates) and similarity of the pixel values p. Two Gaussian kernels are used, one in the spatial domain and one in the pixel-value domain.
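The two-kernel idea can be sketched directly: the spatial Gaussian is fixed, while the range (pixel-value) Gaussian is recomputed per window, so neighbors on the other side of a strong edge get negligible weight. A minimal, unoptimized sketch:

```python
import numpy as np

def bilateral_filter(image, size, sigma_s, sigma_r):
    """Edge-preserving smoothing: each neighbor is weighted by a spatial
    Gaussian (distance in x, y) times a range Gaussian (difference in
    pixel value), and the weights are renormalized per pixel."""
    p = size // 2
    padded = np.pad(image, p, mode='edge')
    ax = np.arange(size) - p
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma_s ** 2))
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            window = padded[y:y + size, x:x + size]
            rng = np.exp(-(window - image[y, x]) ** 2 / (2.0 * sigma_r ** 2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * window) / np.sum(weights)
    return out
```

With a small sigma_r a sharp step edge stays sharp, because cross-edge neighbors contribute essentially zero weight; with sigma_r very large the filter degenerates into an ordinary Gaussian blur.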
Practice: Custom Camera Systems The Vest Camera
Note: figures from [1].
References
[1] Rafael Mosberger and Henrik Andreasson. An inexpensive monocular vision system for tracking humans in industrial environments. In Proceedings of the International Conference on Robotics and Automation (ICRA), 2013.