Cvision 2 Digital Imaging António J. R. Neves (an@ua.pt) & João Paulo Silva Cunha & Bernardo Cunha IEETA / Universidade de Aveiro

Outline Image sensors Camera calibration Sampling and quantization Data structures for digital images Histograms Acknowledgements: Most of this course is based on the excellent courses offered by Prof. Shree Nayar at Columbia University, USA, and by Prof. Srinivasa Narasimhan at CMU, USA. This was also based on Prof. Miguel Coimbra's slides. Please acknowledge the original source when reusing these slides for academic purposes. 2

Topic: Image Sensors Image sensors Camera Calibration Sampling and quantization Data structures for digital images Histograms 3

Image Sensors Considerations Speed Resolution Signal / Noise Ratio Cost 4

Image Sensors Convert light into an electric charge CCD (charge-coupled device) Higher dynamic range High uniformity Lower noise CMOS (complementary metal-oxide semiconductor) Lower voltage Higher speed Lower system complexity 5

CCD Performance Characteristics Linearity principle: incoming photon flux vs. output signal. Sometimes cameras are made non-linear on purpose; calibration must then be done (using reflectance charts). Dark current noise: non-zero output signal when incoming light is zero. Sensitivity: minimum detectable signal produced by the camera. 6

Sensing Brightness Incoming light has a spectral distribution $p(\lambda)$ and the sensor has a spectral response $q(\lambda)$, so the pixel intensity becomes $I = k \int q(\lambda)\,p(\lambda)\,d\lambda$ 7

How do we sense colour? Do we need an infinite number of filters? (Rods and cones.) Three filters of different spectral responses. 8

Sensing Colour Tristimulus (trichromatic) values $(I_R, I_G, I_B)$. Camera's spectral response functions: $h_R(\lambda)$, $h_G(\lambda)$, $h_B(\lambda)$. $I_R = k \int h_R(\lambda)\,p(\lambda)\,d\lambda$, $I_G = k \int h_G(\lambda)\,p(\lambda)\,d\lambda$, $I_B = k \int h_B(\lambda)\,p(\lambda)\,d\lambda$ 9
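
As a concrete illustration of the integrals above, the sketch below evaluates them numerically with NumPy. The flat spectral power distribution and the Gaussian-shaped response curves h_R, h_G, h_B are illustrative assumptions, not real camera data.

```python
import numpy as np

# Wavelength grid over the visible range, in nanometres (1 nm steps).
lam = np.arange(400.0, 701.0, 1.0)
dlam = 1.0

# Hypothetical spectral power distribution p(lambda): flat "white" light.
p = np.ones_like(lam)

# Hypothetical camera response functions h_R, h_G, h_B modelled as Gaussians
# (assumed shapes for illustration only).
def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

h_R, h_G, h_B = gauss(lam, 600, 30), gauss(lam, 540, 30), gauss(lam, 450, 30)

k = 1.0  # sensor gain constant

# I_c = k * integral of h_c(lambda) * p(lambda) d(lambda), as a Riemann sum.
I_R = k * np.sum(h_R * p) * dlam
I_G = k * np.sum(h_G * p) * dlam
I_B = k * np.sum(h_B * p) * dlam
print(I_R, I_G, I_B)
```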

Sensing Colour 3-CCD (light beam splitter), Foveon X3™, Bayer pattern. 10
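
To make the Bayer idea concrete, here is a deliberately simplified sketch that turns an RGGB mosaic into a half-resolution RGB image by collapsing each 2x2 cell; real cameras instead interpolate (demosaic) to full resolution. The function name and test data are illustrative assumptions.

```python
import numpy as np

def bayer_rggb_to_rgb_halfres(mosaic):
    # Each 2x2 cell of an RGGB Bayer mosaic holds one R, two G and one B sample;
    # collapse it into a single RGB pixel, averaging the two greens.
    r  = mosaic[0::2, 0::2].astype(float)
    g1 = mosaic[0::2, 1::2].astype(float)
    g2 = mosaic[1::2, 0::2].astype(float)
    b  = mosaic[1::2, 1::2].astype(float)
    return np.dstack([r, (g1 + g2) / 2.0, b])

# Usage with a random 4x4 "sensor readout" (hypothetical data):
mosaic = np.random.randint(0, 256, (4, 4))
print(bayer_rggb_to_rgb_halfres(mosaic).shape)  # (2, 2, 3)
```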

Several types of cameras an@ua.pt 11

Several types of cameras Several interfaces (FireWire, GigE, CameraLink, USB, ...). Scientific usage (high resolution, long exposure time, ...). High speed (e.g. 1000 fps). Linear / line-scan (e.g. 10000 lines per second). 3D. Infrared (e.g. 8 to 14 µm). High dynamic range (e.g. using a prism and two sensors). Multispectral. an@ua.pt 12

Topic: Camera Calibration Image sensors Camera Calibration Sampling and quantization Data structures for digital images Histograms 13

Definitions - Luminance Luminance is normally defined as a measurement of the photometric luminous intensity per unit area of light travelling in a given direction. It is therefore used to describe the amount of light that goes through, or is emitted from, a particular area and falls within a given solid angle. The SI unit for luminance is the candela per square meter (cd/m²). The CGS unit of luminance is the stilb, which is equal to one candela per square centimeter, or 10 kcd/m².

Definitions - Chrominance Chrominance is a numerical value that describes the way a certain amount of light is distributed among the visible spectrum. A black and white image has a balanced distribution of energy across the visible spectrum, matched to the band-pass characteristics of the human visual system; when viewed by a human, a B&W image therefore carries no color information, i.e. its chrominance is zero. Chrominance carries no luminance information, but is used together with luminance to describe a colored image defined, for instance, by an RGB triplet. Any RGB triplet in which R = G = B has no chrominance information.

RGB & YUV Separating Luminance from Chrominance Given an RGB triplet, we can define a derived triplet in which luminance and chrominance are separated. Luminance: $Y = W_R R + W_G G + W_B B$. Chrominance: $U = U_{max}\,\frac{B - Y}{1 - W_B} \approx 0.492\,(B - Y)$ and $V = V_{max}\,\frac{R - Y}{1 - W_R} \approx 0.877\,(R - Y)$, with $W_R = 0.299$, $W_G = 0.587$, $W_B = 0.114$, $U_{max} = 0.436$ and $V_{max} = 0.615$. These values originally derive from the general model of the human visual system and had a significant impact on the ability to develop a color television system compatible with the previous B&W television systems. A symmetric (inverse) operation can be performed in order to recover the original RGB triplet.
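
A minimal sketch of the forward and inverse transforms above, using the slide's constants; values are assumed to be normalized to [0, 1].

```python
# BT.601-style luma weights and the U/V scaling constants from the slide.
W_R, W_G, W_B = 0.299, 0.587, 0.114
U_MAX, V_MAX = 0.436, 0.615

def rgb_to_yuv(r, g, b):
    """R, G, B in [0, 1] -> (Y, U, V)."""
    y = W_R * r + W_G * g + W_B * b
    u = U_MAX * (b - y) / (1.0 - W_B)   # ~ 0.492 * (B - Y)
    v = V_MAX * (r - y) / (1.0 - W_R)   # ~ 0.877 * (R - Y)
    return y, u, v

def yuv_to_rgb(y, u, v):
    """Symmetric (inverse) operation recovering the RGB triplet."""
    b = y + u * (1.0 - W_B) / U_MAX
    r = y + v * (1.0 - W_R) / V_MAX
    g = (y - W_R * r - W_B * b) / W_G
    return r, g, b

print(rgb_to_yuv(0.5, 0.5, 0.5))  # gray pixel (R = G = B): U = V = 0
```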

The image processing pipeline A typical image processing pipeline (inside the imaging device) for a tri-stimulus system is shown below. This processing can be performed on the YUV or RGB components depending on the system. This should be understood as a mere example.

The image processing pipeline Depending on the system, more or fewer image parameters may be available for the user to control. Also, some of these parameters (namely brightness, contrast and saturation) are intrinsic characteristics of the original image apart from being externally controllable parameters.

Brightness (as an intrinsic image characteristic) Brightness is one of the intrinsic characteristics of the original image. It represents a measure of the average amount of light that is integrated over the image during the exposure time. Exposure time (that is, the period of time during which the sensor receives light while forming the image) may or may not be a controllable parameter of the imaging device. If the brightness is too high, overexposure may occur, which will white-saturate part or all of the image.

Brightness (as a controllable parameter) The brightness parameter is basically a constant (or offset) that can be added to, or subtracted from, the luminance component of the image. (Transfer curve: output vs. input.)

Contrast (as an intrinsic image characteristic) There is no unique definition of contrast. One of the most used is that contrast is the difference in luminance (or color) along the 2D space that makes an object distinguishable. In visual perception of the real world, contrast is determined by the difference in the color and brightness of an object and the other objects within the same field of view. The faster and the higher the luminance (or color) changes along the space, the higher the contrast. The maximum possible contrast of an image is also called the contrast ratio or dynamic range.

Contrast (as an intrinsic image characteristic) One of the possible definitions of contrast is given by the expression $\text{contrast} = \frac{\text{luminance difference}}{\text{average luminance}}$. The human eye's contrast sensitivity function is a typical band-pass filter with a maximum at around 4 cycles per degree, with sensitivity decreasing on both sides of that maximum. This means that the human visual system can detect lower contrast differences at 4 cycles per degree than at any other spatial frequency.

Contrast (as a controllable parameter) The contrast parameter is basically a variation in the gain applied to the luminance component of the image. (Transfer curve: output vs. input.)

Contrast + Brightness (as controllable parameters) It is common that contrast and brightness are actually combined into a single transfer function. (Transfer curve: output vs. input.)
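
A minimal sketch of such a combined transfer function applied to an 8-bit luminance channel; the gain/offset defaults and the random test image are assumptions.

```python
import numpy as np

def adjust_contrast_brightness(y, gain=1.2, offset=10.0):
    # Contrast acts as a multiplicative gain, brightness as an additive offset,
    # followed by clipping back to the 8-bit range.
    out = gain * y.astype(float) + offset
    return np.clip(out, 0, 255).astype(np.uint8)

y = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # hypothetical luminance frame
print(adjust_contrast_brightness(y).dtype, adjust_contrast_brightness(y).max())
```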

White Balance (as a controllable parameter) White balance is the global adjustment of the intensities of the colors (typically the red, green, and blue primary colors). An important goal of this adjustment is to render specific colors, particularly neutral colors, correctly; hence, the general method is sometimes called gray balance, neutral balance, or white balance. This balance is required because the spectral energy distribution of the light differs depending on the illumination source.
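
One common automatic way to estimate the per-channel gains is the gray-world heuristic (not necessarily the method used by any particular camera); a minimal sketch:

```python
import numpy as np

def gray_world_white_balance(rgb):
    # Assume the scene averages to neutral gray: scale each channel so that
    # the three channel means become equal. 'rgb' is an H x W x 3 float image
    # with values in [0, 1].
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(rgb * gains, 0.0, 1.0)

img = np.random.rand(4, 4, 3)          # hypothetical image
print(gray_world_white_balance(img).reshape(-1, 3).mean(axis=0))  # roughly equal means
```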

White Balance - Examples

Saturation (as an intrinsic image characteristic) The saturation of a color is determined by a combination of the light intensity acquired by a pixel and how that light is distributed across the spectrum of different wavelengths. The purest (most saturated) color is obtained when using a single wavelength at high intensity (laser light is a good example). If the light intensity declines, the saturation also declines. A non-saturated image (B&W) has a spectral distribution that matches the spectral sensitivity of the human eye. Saturation is sometimes also defined as the amount of white blended into a pure color.

Saturation Saturation (as a controllable parameter) To reduce the saturation of an image we can add white to the original colors. In fact this is the same as changing the gain of the U and V chromatic components.
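
Since desaturation amounts to reducing the gain of the chrominance components, here is a tiny sketch (reusing the YUV conversion idea from the earlier slide; the parameter name s is an assumption):

```python
def adjust_saturation(y, u, v, s=0.5):
    # Scale only the chrominance: s = 0 gives a B&W image, s = 1 keeps the
    # original colours, s > 1 over-saturates. Luminance Y is untouched.
    return y, s * u, s * v

print(adjust_saturation(0.6, 0.1, -0.2, s=0.0))  # (0.6, 0.0, -0.0): fully desaturated
```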

Gamma Gamma correction is the name of a nonlinear operation used to encode and decode luminance or RGB tristimulus values. In the simplest cases gamma is defined by the power-law expression $V_{out} = A\,V_{in}^{\gamma}$, where $A$ is a constant and the input and output values are non-negative real values. In most cases $A = 1$, and inputs and outputs are typically in the range $[0, 1]$.
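
A minimal sketch of the power law above with A = 1; the example gamma values are typical assumptions.

```python
import numpy as np

def gamma_correct(v, gamma=2.2, A=1.0):
    # V_out = A * V_in ** gamma, with inputs and outputs in [0, 1].
    return A * np.power(v, gamma)

v = np.linspace(0.0, 1.0, 5)
print(gamma_correct(v, gamma=1 / 2.2))  # encoding: brightens mid-tones
print(gamma_correct(v, gamma=2.2))      # decoding: darkens mid-tones
```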

Gamma - Examples

Sharpness (as a controllable parameter) Sharpness is a measure of the spatial distribution of energy across frequencies in the image. Not all devices provide access to this parameter. Sharpness basically allows control of the cut-off frequency of a low-pass spatial filter. This may be very useful if the image is later going to be decimated, since it helps prevent spatial aliasing artifacts.

Sharpness (as a controllable parameter) - Examples.

Topic: Sampling and quantization Image sensors Camera Calibration Sampling and quantization Data structures for digital images Histograms 33

Components of a Computer Vision System Camera Lighting Computer Scene Scene Interpretation 34

Digital Images What we see What a computer sees 35

Simple Image Model Image as a 2D light-intensity function $f(x, y)$: continuous, with non-zero, finite values, $0 < f(x, y) < \infty$. (Plot: intensity vs. position.) [Gonzalez & Woods] 36

Analog to Digital The scene is: projected on a 2D plane, sampled on a regular grid, and each sample is quantized (rounded to the nearest integer): $\hat{f}(i, j) = \text{Quantize}\{\, f(i, j) \,\}$ 37
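
A minimal sketch of the quantization step for values already sampled on a grid; the choice of m and the random test data are assumptions.

```python
import numpy as np

def quantize(f, m=8):
    # Map a continuous image f with values in [0, 1] onto G = 2**m integer levels.
    G = 2 ** m
    return np.round(f * (G - 1)).astype(np.uint16), G

f = np.random.rand(4, 4)      # stand-in for the sampled (still continuous-valued) image
q, G = quantize(f, m=3)       # 8 gray levels
print(G, q)
```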

Images as Matrices Each point is a pixel with amplitude $f(x, y)$. An image is a matrix with size $N \times M$: $f = \begin{bmatrix} f(0,0) & f(0,1) & \cdots & f(0,M-1) \\ f(1,0) & f(1,1) & \cdots & f(1,M-1) \\ \vdots & & & \vdots \\ f(N-1,0) & f(N-1,1) & \cdots & f(N-1,M-1) \end{bmatrix}$, where each entry is a pixel. 38

Sampling Theorem Continuous signal: $f(x)$. Shah function (impulse train): $s(x) = \sum_{n=-\infty}^{\infty} \delta(x - n x_0)$, with sampling interval $x_0$. Sampled function: $f_s(x) = f(x)\,s(x) = \sum_{n=-\infty}^{\infty} f(x)\,\delta(x - n x_0)$ 39

Quantization Analog: $0 < f(x, y) < \infty$, which would require infinite storage space per pixel! Digital: quantization to a finite number of levels. 40

Quantization Levels $G$ = number of levels, $m$ = storage bits per pixel, with $G = 2^m$. Round each value to its nearest level. 41

Effect of quantization 42

Effect of quantization 43

Image Size Storage space. Spatial resolution: $N \times M$. Quantization: $m$ bits per pixel. Required bits: $b = N \times M \times m$. Rule of thumb: more storage space means more image quality. 44
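
A quick worked example of the $b = N \times M \times m$ rule for a hypothetical 1024 x 1024, 8-bit image:

```python
N, M, m = 1024, 1024, 8       # spatial resolution and bits per pixel
b = N * M * m                 # required bits
print(b, "bits =", b // 8, "bytes =", b // (8 * 1024), "KiB")   # 1024 KiB = 1 MiB
```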

Image Scaling This image is too big to fit on the screen. How can we reduce it? How do we generate a half-sized version? 45

Sub-sampling 1/8 1/4 Throw away every other row and column to create a 1/2 size image - called image sub-sampling 46

Sub-sampling 1/2 1/4 (2x zoom) 1/8 (4x zoom) 47

Sub-Sampling with Gaussian Pre-Filtering Gaussian 1/2 G 1/4 G 1/8 48
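
A minimal sketch contrasting naive sub-sampling with Gaussian pre-filtering; it assumes SciPy is available, and sigma is an illustrative choice.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def downsample_by_2(img, prefilter=True, sigma=1.0):
    # Optionally low-pass filter with a Gaussian before dropping every other
    # row and column; without the pre-filter, aliasing artifacts appear.
    if prefilter:
        img = gaussian_filter(img.astype(float), sigma=sigma)
    return img[::2, ::2]

img = np.random.rand(512, 512)                                    # hypothetical grayscale image
eighth = downsample_by_2(downsample_by_2(downsample_by_2(img)))   # 1/8 size
print(eighth.shape)                                               # (64, 64)
```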

Compare with... 1/2 1/4 (2x zoom) 1/8 (4x zoom) 49

Topic: Data structures for digital images Image sensors Sampling and quantization Data structures for digital images Histograms 50

Data Structures for Digital Images Are there other ways to represent digital images? What we see What a computer sees 51

Chain codes Chains represent the borders of objects. Coding with chain codes is relative: assume an initial starting point for each object. Needs segmentation! Freeman Chain Code: using a Freeman chain code and considering the top-left pixel of the image as the starting point: 70663422 52
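
To illustrate, the sketch below decodes an 8-direction Freeman chain code (such as the slide's 70663422) back into boundary coordinates. The direction-to-offset convention varies between texts; the one used here (0 = east, counter-clockwise, y growing downwards) is an assumption.

```python
# Freeman 8-direction chain code offsets: 0 = east, then counter-clockwise,
# with image coordinates (x right, y down). Conventions vary between texts.
DIRS = {0: (1, 0), 1: (1, -1), 2: (0, -1), 3: (-1, -1),
        4: (-1, 0), 5: (-1, 1), 6: (0, 1), 7: (1, 1)}

def decode_chain(code, start=(0, 0)):
    # Turn a chain-code string into the sequence of (x, y) boundary points.
    x, y = start
    points = [(x, y)]
    for c in code:
        dx, dy = DIRS[int(c)]
        x, y = x + dx, y + dy
        points.append((x, y))
    return points

print(decode_chain("70663422"))   # ends back at (0, 0): a closed boundary
```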

Topological Data Structures Region Adjacency Graph: nodes = regions, arcs = relationships. Describes the elements of an image and their spatial relationships. Needs segmentation! 53

Relational Structures Stores relations between objects. Important semantic information of an image. Needs segmentation and an image description (features)! Relational Table 54

Topic: Histograms Image sensors Sampling and quantization Data structures for digital images Histograms 55

Histograms In statistics, a histogram is a graphical display of tabulated frequencies. Typically represented as a bar chart: 56

Image Histograms Colour or Intensity distribution. Typically: Reduced number of bins. Normalization. Compressed representation of an image. No spatial information whatsoever! 57
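
A minimal sketch of an intensity histogram with a reduced number of bins and optional normalization; the bin count and the random test image are assumptions.

```python
import numpy as np

def intensity_histogram(img, bins=32, normalize=True):
    # Histogram of an 8-bit grayscale image. Normalizing makes histograms of
    # differently sized images comparable; all spatial information is lost.
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    return hist / hist.sum() if normalize else hist

img = np.random.randint(0, 256, (240, 320), dtype=np.uint8)   # hypothetical image
print(intensity_histogram(img).round(3))
```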

Histogram Normalization Improves the contrast in an image in order to stretch out the intensity range. The goal is to reshape the image histogram to make it flat and wide. an@ua.pt 58
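
What the slide calls histogram normalization is often realised as classic histogram equalization; a minimal sketch for an 8-bit grayscale image (one possible implementation, not necessarily the one used in the course):

```python
import numpy as np

def equalize(img):
    # Map each gray level through the normalized cumulative histogram, which
    # stretches the intensity range and flattens the resulting histogram.
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum() / img.size            # normalized CDF in (0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

img = np.random.randint(50, 100, (100, 100), dtype=np.uint8)   # low-contrast test image
out = equalize(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```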

Color Histogram As many histograms as axes of the color space. E.g. RGB colour space: Red Histogram, Green Histogram, Blue Histogram. Combined histogram. 59

Resources J. C. Russ, Chapter 2. R. Gonzalez and R. Woods, Chapter 2. 60