Sensors and Sensing: Cameras and Camera Calibration


Sensors and Sensing: Cameras and Camera Calibration
Todor Stoyanov
Mobile Robotics and Olfaction Lab, Center for Applied Autonomous Sensor Systems (AASS), Örebro University, Sweden
todor.stoyanov@oru.se
T. Stoyanov (MRO Lab, AASS), Sensors & Sensing, 20.11.2014

Outline
1. Camera Models
2. Camera Calibration
3. Color, Infrared and Thermal Cameras
4. Image Noise and Filters
5. Practice: Custom Camera Systems

Camera Models: Electromagnetic Spectrum
Cameras are passive devices that measure electromagnetic radiation reflected by objects in the environment. Conventional cameras detect light in the visible range of the electromagnetic spectrum, at wavelengths between 430 nm and 790 nm... but plenty of cameras are built for other ranges: e.g., infrared, thermal, UV.
[Figure: the electromagnetic spectrum by wavelength, spanning radio, microwave, infrared, visible, ultraviolet, X-ray and gamma radiation.]

Camera Models: CCD Sensors, Optics and Shutters
Modern cameras consist of a light-sensitive element, a lens and (optionally) a shutter. The light sensor is usually implemented as a Charge-Coupled Device (CCD) or as a CMOS chip. Lenses focus light onto the sensor array. Mechanical shutters can be used to expose the chip for only a short period of time; electronic shutters are often used instead.

Camera Models: Digital Images
A digital image I can be thought of as a function f(x, y) : X × Y → Z, where X = [0, P_x] ⊂ ℕ and Y = [0, P_y] ⊂ ℕ are pixel coordinates in the image plane. Depending on the type of image, Z can be:
- Binary image if Z = {0, 1}
- Gray-scale image if Z ⊂ ℝ
- Color image if Z ⊂ ℝ³
Each pixel in the image corresponds to a single cell of the CCD array.

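The three image types above map naturally onto array shapes and value ranges. A minimal NumPy sketch (image sizes and values are arbitrary, chosen only for illustration):

```python
import numpy as np

# Binary image: Z = {0, 1}
binary = np.zeros((4, 4), dtype=bool)
binary[1:3, 1:3] = True          # a 2x2 block of "on" pixels

# Gray-scale image: Z a subset of R (here the interval [0, 1])
gray = np.linspace(0.0, 1.0, 16).reshape(4, 4)

# Color image: Z a subset of R^3 (here 8-bit RGB)
color = np.zeros((4, 4, 3), dtype=np.uint8)
color[..., 0] = 255              # a pure red image

print(binary.sum())              # number of "on" pixels
print(gray[0, 0], gray[3, 3])    # darkest and brightest pixel
print(color[0, 0])               # RGB vector of one pixel
```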
Camera Models: The Pinhole Camera
A pinhole camera is a simple camera without a lens that projects light directly onto an image plane. The pinhole camera model can be extended to model more complex cameras. Some definitions follow.

Camera Models: The Pinhole Camera
- Camera center: the focal point at which all rays entering the camera converge
- Principal axis: by definition, the Ẑ axis pointing out of the camera center
- Image plane: the CCD plane where the image is acquired
- Focal length f: the distance from the camera center to the image plane
[Figure: camera center, principal axis and image plane.]

Camera Models: The Pinhole Camera
Given a point s = (x, y, z, 1)^T in the world coordinate frame (homogeneous coordinates), we obtain the projection of the point to a corresponding pixel (x_u, y_u) on the image plane as:

    [x_u]   [f_x  0   c_x]
    [y_u] = [ 0  f_y  c_y] · H · (x, y, z, 1)^T     (1)
    [ 1 ]   [ 0   0    1 ]

          = K·H·s                                   (2)

where K is the intrinsic camera matrix and H is the world-to-camera transform.
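Equations (1)-(2) can be sketched in a few lines of NumPy. The intrinsic parameters below are made-up values, and H is taken to be the identity transform (world frame coinciding with the camera frame) for simplicity:

```python
import numpy as np

# Hypothetical intrinsics: focal lengths and principal point are invented.
fx, fy, cx, cy = 500.0, 500.0, 320.0, 240.0
K = np.array([[fx,  0.0, cx],
              [0.0, fy,  cy],
              [0.0, 0.0, 1.0]])

# World-to-camera transform H: identity rotation, zero translation (3x4).
H = np.hstack([np.eye(3), np.zeros((3, 1))])

s = np.array([0.2, -0.1, 2.0, 1.0])    # homogeneous world point

p = K @ H @ s                           # Eq. (1)-(2)
x_u, y_u = p[0] / p[2], p[1] / p[2]     # divide by depth to get pixel coords
print(x_u, y_u)
```

Note the final division by the third (depth) coordinate: the projection is only defined up to scale in homogeneous coordinates.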

Camera Models: Lens Distortion
In practice, adding a lens to the system introduces several types of distortion:
- radial distortion is due to imperfections in the lens curvature
- tangential distortion is due to imperfect alignment of the lens center and the principal axis
- other types of distortion are more difficult to model.
Source: http://www.uni-koeln.de/~al001/radcor_files/hs100.htm

Camera Models: Modeling Distortion
Going back to Eq. 1, consider the projection of s = (x, y, z, 1)^T onto the normalized image plane:

    s' = (x', y', 1)^T = (x/z, y/z, 1)^T     (3)

We define r^2 = x'^2 + y'^2 as the squared radius of the projected point relative to the principal point. The distorted point ŝ (the point the lens actually maps onto the sensor) is then computed as:

    x̂ = x'·(1 + k1·r^2 + k2·r^4) + 2·p1·x'·y' + p2·(r^2 + 2·x'^2)
    ŷ = y'·(1 + k1·r^2 + k2·r^4) + p1·(r^2 + 2·y'^2) + 2·p2·x'·y'     (4)

The observed pixel coordinates of s are then obtained as:

    (x_u, y_u, 1)^T = K·(x̂, ŷ, 1)^T     (5)
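The radial + tangential model of Eq. (4) is easy to sketch directly. The coefficients k1, k2, p1, p2 below are arbitrary illustrative values, not calibrated numbers:

```python
# Hypothetical distortion coefficients, for illustration only.
k1, k2 = -0.2, 0.05       # radial terms
p1, p2 = 0.001, -0.0005   # tangential terms

def distort(x, y):
    """Apply Eq. (4) to normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# A point on the principal axis (x = y = 0) is unaffected:
print(distort(0.0, 0.0))
# Points far from the center move noticeably:
print(distort(0.5, 0.5))
```

This illustrates why radial distortion is most visible near image corners: the r^2 and r^4 terms grow quickly with distance from the principal point.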

Camera Calibration: 2D/3D Calibration Patterns
Cameras are calibrated in order to determine the focal lengths and center offsets of the camera matrix K, the radial distortion coefficients k1, k2 and the tangential distortion coefficients p1, p2. Calibration from a natural scene is not easily done, so we use calibration patterns:
- 2D patterns: a known pattern printed on a plane
- (rarely) 3D patterns: a known 3D geometry
Source: ROS camera calibration tutorial

Camera Calibration: Chessboard Calibration Basics
- The number of chessboard squares and their size are known in advance.
- We fix the world reference frame to the top-left corner of the board.
- All points lie on a plane in the world frame, with z = 0. This means we can drop one column of rotation coefficients from H.
Note: the following slides follow the derivations shown in http://ais.informatik.uni-freiburg.de/teaching/ws10/robotics2/pdfs/rob2-10-camera-calibration.pdf

Camera Calibration: Chessboard Calibration Formulation
For a point (x, y, z, 1)^T we have:

    [x_u]       [r11 r12 r13 t1]   [x]
    [y_u] = K · [r21 r22 r23 t2] · [y]     (6)
    [ 1 ]       [r31 r32 r33 t3]   [z]
                                   [1]

Camera Calibration: Chessboard Calibration Formulation
Setting z = 0, the third column of rotation coefficients drops out:

    [x_u]       [r11 r12 t1]   [x]
    [y_u] = K · [r21 r22 t2] · [y]     (7)
    [ 1 ]       [r31 r32 t3]   [1]

Camera Calibration: Chessboard Calibration Formulation
With z = 0:

    [x_u]   [f_x  0  c_x]   [r11 r12 t1]   [x]
    [y_u] = [ 0  f_y c_y] · [r21 r22 t2] · [y]     (8)
    [ 1 ]   [ 0   0   1 ]   [r31 r32 t3]   [1]

          = H · (x, y, 1)^T                        (9)

H is called the homography matrix. Let:

    H = (h1, h2, h3) = K·(r1, r2, t)     (10)

Camera Calibration: Chessboard Calibration Formulation
Knowing

    (h1, h2, h3) = K·(r1, r2, t)     (11)

we have r1 = K^-1·h1 and r2 = K^-1·h2. We also know that r1 and r2 are columns of a rotation matrix, so they form an orthonormal basis, i.e. r1^T·r2 = 0 and r1^T·r1 = r2^T·r2 = 1. Therefore:

    r1^T·r2 = 0                                        (12)
    h1^T·K^-T·K^-1·h2 = 0                              (13)

and

    r1^T·r1 = r2^T·r2                                  (14)
    h1^T·K^-T·K^-1·h1 = h2^T·K^-T·K^-1·h2              (15)
    h1^T·K^-T·K^-1·h1 - h2^T·K^-T·K^-1·h2 = 0          (16)

Camera Calibration: Chessboard Calibration, Solving the Problem
Re-formulating equations 13 and 16 and unwrapping the coefficients of B = K^-T·K^-1 into b = (b11, b12, b13, b22, b23, b33) (B is symmetric), we can write

    V·b = 0     (17)

where V holds the coefficients from H as in equations 13 and 16.
- As we know the relative positions of the points on the pattern, we can obtain V from a set of images and solve for b.
- The parameters of K can then be obtained by Cholesky factorization B = L·L^T.
- Measurements are noisy, so in practice we solve a least-squares problem, minimizing ||V·b||.
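The pipeline of Eqs. (11)-(17) can be sketched on synthetic data: build homographies from a made-up K and plane poses, stack the two constraints per view into V, solve V·b = 0 via SVD, and recover K by Cholesky factorization. All numeric values below are invented for the check:

```python
import numpy as np

def vij(H, i, j):
    # Coefficient row so that h_i^T B h_j = vij . b,
    # with b = (b11, b12, b13, b22, b23, b33) as on the slide.
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0] * hj[0],
                     hi[0] * hj[1] + hi[1] * hj[0],
                     hi[0] * hj[2] + hi[2] * hj[0],
                     hi[1] * hj[1],
                     hi[1] * hj[2] + hi[2] * hj[1],
                     hi[2] * hj[2]])

def calibrate(Hs):
    # Two constraints per view: Eq. (13) and Eq. (16).
    V = []
    for H in Hs:
        V.append(vij(H, 0, 1))
        V.append(vij(H, 0, 0) - vij(H, 1, 1))
    # Least-squares solution of V b = 0: right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.array(V))
    b = Vt[-1]
    B = np.array([[b[0], b[1], b[2]],
                  [b[1], b[3], b[4]],
                  [b[2], b[4], b[5]]])
    if B[2, 2] < 0:              # b is recovered up to sign; make B pos. def.
        B = -B
    L = np.linalg.cholesky(B)    # B = L L^T with L = K^-T (up to scale)
    K = np.linalg.inv(L).T
    return K / K[2, 2]           # normalize so K[2, 2] = 1

# Synthetic check: a made-up camera observing a plane from three poses.
K_true = np.array([[800.0, 0.0, 320.0],
                   [0.0, 600.0, 240.0],
                   [0.0,   0.0,   1.0]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

Hs = []
for R, t in [(rot_x(0.3), [0.1, 0.0, 2.0]),
             (rot_y(0.4), [0.0, 0.2, 2.5]),
             (rot_x(-0.2) @ rot_y(0.3), [-0.1, 0.1, 3.0])]:
    Hs.append(K_true @ np.column_stack([R[:, 0], R[:, 1], t]))  # Eq. (10)

K_est = calibrate(Hs)
print(np.round(K_est, 1))
```

With noise-free homographies the true intrinsics are recovered almost exactly; with real, noisy corner detections the same linear solve only provides the initial guess for the non-linear refinement discussed next.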

Camera Calibration: Distortion Parameters
The previous derivations solve the problem for the camera matrix K, but ignore distortion. In order to solve for distortion, we need to formulate the re-projection error. The resulting non-linear optimization problem is usually solved in batch using the Levenberg-Marquardt method, linearizing around the solution for K at every iteration. Fortunately, you don't typically have to solve the optimization problem yourself: just use one of the many calibration toolboxes.

Color, Infrared and Thermal Cameras: Color Cameras
The world is colorful! Color images are obtained by adding a filter designed for specific wavelengths to each pixel of the CCD array.
- The filter pattern is called a Bayer filter. Typically, we have red-, green- and blue-sensitive pixels.
- Raw pixel values often come as a stream, and camera drivers perform de-Bayering to obtain the corresponding RGB vector for each pixel.

Color, Infrared and Thermal Cameras: Color Spaces
There are several different models for color:
- RGB systems encode values in equal-sized intervals for red, green and blue.
- HSV space encodes the hue, saturation and value of every pixel, with H ∈ [0, 360°), S ∈ [0, 1], V ∈ [0, 1].
Different color spaces have advantages in different operations. Color spaces are not equivalent, but we can convert between representations.
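Conversion between RGB and HSV can be tried directly with Python's standard colorsys module, which keeps all channels in [0, 1] and returns hue as a fraction of the full circle:

```python
import colorsys

# Pure red: hue 0, fully saturated, full value.
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)
print(round(h * 360), s, v)

# Pure green: hue one third of the way around the circle (120 degrees).
h, s, v = colorsys.rgb_to_hsv(0.0, 1.0, 0.0)
print(round(h * 360), s, v)

# The conversion is invertible (up to floating-point rounding):
print(colorsys.hsv_to_rgb(h, s, v))
```

This per-pixel scalar conversion is exactly what an image library applies element-wise over the whole array when converting a color image between spaces.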

Color, Infrared and Thermal Cameras: Infrared and Thermal Cameras
Cameras can also be sensitive to other parts of the EM spectrum:
- IR cameras measure reflected IR light. They are often used with an IR-diode light source for night-time security applications.
- Thermal cameras can detect emitted IR light.

Image Noise and Filters: Typical Noise in Images
Apart from errors due to lens distortion, images are usually corrupted by additional noise sources:
- Additive Gaussian noise (independent per pixel)
- Salt-and-pepper random noise
- Multiplicative shot noise
Images may be post-processed to filter out noise.

Image Noise and Filters: Filters and Convolution
Spatial-domain image filters work by convolution with a filter kernel (or mask):
- A region of predefined size is slid over the image.
- Each element of the mask contains a weight.
- The value of the filtered pixel is p(x, y) = Σ_{i,j} w_{i,j} · p(x+i, y+j).
- Special care should be taken at the borders.
Example 3×3 kernel:

    0.1 0.1 0.1
    0.1 0.5 0.1
    0.1 0.1 0.1
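The sliding-window sum above can be sketched as a direct, unoptimized NumPy loop. Strictly speaking the code computes correlation (the kernel is not flipped), which coincides with convolution for the symmetric kernels used here; borders are handled by edge replication, one common choice among several:

```python
import numpy as np

def filter2d(img, kernel):
    """Slide `kernel` over `img` and compute a weighted sum per pixel.
    Border pixels are handled by replicating the edge rows/columns."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            region = padded[y:y + kh, x:x + kw]
            out[y, x] = np.sum(region * kernel)
    return out

# The 3x3 kernel from the slide: 0.5 at the center, 0.1 around it.
kernel = np.full((3, 3), 0.1)
kernel[1, 1] = 0.5

img = np.zeros((5, 5))
img[2, 2] = 1.0                # a single bright pixel
print(filter2d(img, kernel))   # the pixel is spread over its neighborhood
```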

Image Noise and Filters: Mean Filter
The mean (average, box) filter places an equal weight on all elements of the kernel, i.e. w_{i,j} = 1/N for a kernel with N elements.
- Averaging blurs out both noise and details in the image.
- It generates defects, e.g. ringing and axis-aligned streaks.

Image Noise and Filters: Gaussian Filter
A Gaussian filter uses a (discretized) normal distribution as its kernel.
- Nearby pixels contribute more to the final result.
- Blurs and smooths images.
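A discretized Gaussian kernel for such a filter can be built in a few lines; the size and sigma below are arbitrary illustrative values:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """2D Gaussian kernel, normalized so the weights sum to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

k = gaussian_kernel(5, sigma=1.0)
print(np.round(k, 3))
print(k.sum())             # 1: the filter preserves average brightness
print(k[2, 2] > k[0, 0])   # True: nearby pixels weigh more
```

Normalizing the weights to sum to 1 is what keeps a smoothing filter from brightening or darkening the image overall.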

Image Noise and Filters: Median Filter
A non-linear filter: substitute pixel p with the median of all pixels inside the filter kernel, e.g. for a 3×3 filter, sort the nine values and take the 5th as the median.
- Less blurry than averaging; very good for removing salt-and-pepper noise.
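A minimal median-filter sketch (again with edge-replicated borders, one possible choice), showing why it excels at salt-and-pepper noise:

```python
import numpy as np

def median_filter(img, size=3):
    """Replace each pixel with the median of its size x size neighborhood."""
    p = size // 2
    padded = np.pad(img, p, mode="edge")
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + size, x:x + size])
    return out

img = np.full((5, 5), 10.0)
img[2, 2] = 255.0            # a single "salt" pixel
print(median_filter(img))    # the outlier is removed entirely
```

A mean filter would smear the outlier over its whole neighborhood; the median simply discards it, because one extreme value in nine never becomes the 5th sorted element.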

Image Noise and Filters: Bilateral Filter
The bilateral filter is an edge-preserving smoothing filter.
- Main idea: pixels are smoothed based on both spatial proximity (x, y coordinates) and similarity of pixel values p.
- Two Gaussian kernels are used: one in the spatial domain and one in the pixel-value domain.
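The two-kernel idea can be sketched directly; the window size and the two sigmas below are arbitrary illustrative values:

```python
import numpy as np

def bilateral(img, size, sigma_s, sigma_r):
    """Edge-preserving smoothing: the weight of each neighbor combines
    spatial distance (sigma_s) and intensity difference (sigma_r)."""
    p = size // 2
    ax = np.arange(size) - p
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2))  # spatial kernel
    padded = np.pad(img, p, mode="edge")
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            region = padded[y:y + size, x:x + size]
            # Range kernel: penalize pixels whose value differs from center.
            rng = np.exp(-(region - img[y, x])**2 / (2 * sigma_r**2))
            w = spatial * rng
            out[y, x] = np.sum(w * region) / np.sum(w)  # normalized average
    return out

# A sharp step edge survives filtering, unlike with plain Gaussian blur:
img = np.zeros((6, 6))
img[:, 3:] = 100.0
smoothed = bilateral(img, size=3, sigma_s=1.0, sigma_r=10.0)
print(np.round(smoothed, 2))
```

Because pixels across the edge differ by far more than sigma_r, their range weight is essentially zero, so each side of the edge is averaged only with itself.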

Practice: Custom Camera Systems. The Vest Camera
Note: figures from [1].

References
[1] Rafael Mosberger and Henrik Andreasson. An inexpensive monocular vision system for tracking humans in industrial environments. In Proceedings of the International Conference on Robotics and Automation (ICRA), 2013.