Chapter 2: Digital Image Fundamentals

Digital image processing is based on mathematical and probabilistic models, as well as on human intuition and analysis.

2.1 Visual Perception

How are images formed in the eye? What are the eye's physical limitations? How do humans visually interpret images?

2.1.1 Structure of the human eye

Main structures: cornea, retina, sclera, and choroid.

Three membranes enclose the eye:

Cornea and sclera: the cornea is a tough, transparent tissue that covers the anterior surface of the eye; the sclera is an opaque membrane that encloses the remainder of the optic globe.

Choroid: a network of blood vessels that supplies nutrition to the eye. At its anterior extreme, the choroid is divided into the ciliary body and the iris diaphragm; the central opening of the iris (the pupil) varies in diameter from 2 to 8 mm.

Retina: the innermost membrane of the eye, described below.

The lens is made up of concentric layers of fibrous cells and is suspended by fibers attached to the ciliary body. It absorbs approximately 8% of the visible light spectrum and contains 60-70% water, about 6% fat, and protein.

The retina lines the inside of the wall's posterior portion and contains two classes of light receptors:

Cones: 6-7 million, located primarily in the central portion of the retina; highly sensitive to color (roughly 65% respond to red, 33% to green, and 2% to blue); responsible for photopic (bright-light) vision.

Rods: 75-150 million, distributed over the entire retinal surface; not involved in color vision and sensitive to low illumination; responsible for scotopic (dim-light) vision.

Receptor density varies with angular distance from the fovea (Fig. 2.2). The cones are most dense in the center of the retina: in the fovea, the density of cones is about 150,000 elements/mm², giving roughly 337,000 cones in the fovea.


2.1.2 Image Formation in the Eye

The distance between the center of the lens and the retina (the focal length) varies from about 17 mm down to about 14 mm, because the shape of the lens is controlled by the tension of the fibers of the ciliary body. The retinal image is focused primarily on the region of the fovea. Perception then takes place by the excitation of light receptors, which transform radiant energy into electrical impulses that are ultimately decoded by the brain.
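
As a rough illustration of this geometry, the sketch below uses similar triangles to estimate the size of the retinal image; the 15 m object viewed from 100 m and the 17 mm lens-to-retina distance are assumed example values, and the function name is ours.

```python
# Minimal sketch (assumed example values): estimating the retinal image size
# by similar triangles: object_height / object_distance = image_height / focal_length.
def retinal_image_height_mm(object_height_m, object_distance_m, focal_length_mm=17.0):
    """Height of the retinal image in millimetres."""
    return object_height_m / object_distance_m * focal_length_mm

# A 15 m tall object viewed from 100 m, with the lens-to-retina distance at 17 mm:
print(f"{retinal_image_height_mm(15.0, 100.0):.2f} mm")  # about 2.55 mm
```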


2.1.3 Brightness adaptation and discrimination

The range of light intensity levels to which the human visual system can adapt is enormous, on the order of 10^10. Subjective brightness is a logarithmic function of the light intensity incident on the eye. In photopic vision alone, the range is about 10^6. Brightness adaptation: the visual system cannot operate over such a range simultaneously; the range of intensities it can discriminate at any one time is rather small compared with the total adaptation range. The brightness adaptation level is the current sensitivity level of the visual system.

There is a limited range of subjective brightness that the eye can perceive when adapted to a given level.

Experiment: a short-duration flash of increment ΔI is applied in a small circular area against a uniform background of intensity I, to determine whether the increment is bright enough to be perceived.

ΔI_c is the increment of illumination that is discriminable 50% of the time against background illumination I. The quantity ΔI_c/I is called the Weber ratio. A small ΔI_c/I means that a small percentage change in intensity is discriminable, i.e., good brightness discrimination. If the background illumination is held constant and the intensity of the object is allowed to vary incrementally from never perceived to always perceived, a typical observer can discern only one to two dozen different intensity changes. This has implications for the number of gray levels in a digital image: with an insufficient number of gray levels, the contouring effect appears.
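
The sketch below simply evaluates the Weber ratio for a few (background, just-discriminable increment) pairs; the numbers are invented for illustration only.

```python
# Minimal sketch (hypothetical measurements): computing the Weber ratio
# delta_I_c / I. Smaller values mean better brightness discrimination.
measurements = [
    (10.0, 0.8),      # (background I, just-discriminable increment delta_I_c)
    (100.0, 2.5),
    (1000.0, 22.0),
]

for I, dIc in measurements:
    print(f"I = {I:7.1f}   delta_I_c = {dIc:5.1f}   Weber ratio = {dIc / I:.3f}")
```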

At low illumination levels, vision is carried out by the rod cells; at high illumination levels it is carried out by the cone cells, which provide better brightness discrimination.

Perceived brightness is not a simple function of intensity: it is approximately logarithmic in intensity, and a region's perceived brightness does not depend only on its own intensity (Fig. 2.8). This is the phenomenon of simultaneous contrast.


2.1.4 Light and the EM Spectrum

Light is the particular type of EM radiation that can be perceived by the human eye. A green object, for example, reflects light with wavelengths primarily in the 500 to 570 nm range. Chromatic (colored) light spans the EM spectrum from about 0.43 μm (violet) to 0.79 μm (red). Radiance: the total energy that flows from the light source, measured in watts (W). Luminance: measured in lumens (lm), the amount of energy an observer perceives from the light source. Brightness: a subjective descriptor of light perception.

2.3 Image Sensing and Acquisition

An image is produced by the combination of an illumination source and a scene: the source illuminates a 3-D scene, and the sensing elements record the energy reflected from (or transmitted through) the scene. The illumination may originate from a conventional EM source (visible light, infrared, X-rays), from ultrasound, or even from a computer-generated illumination pattern.


2.3.1 A single sensor

2.3.2 A sensor strip

2.3.3 A sensor array

2.3.4 Image formation model

A monochrome image is a 2-D function f(x, y) characterized by two components: the amount of source illumination incident on the scene, i(x, y), and the amount of illumination reflected by the objects in the scene, the reflectance r(x, y):

f(x, y) = i(x, y) r(x, y), where 0 < i(x, y) < ∞ and 0 < r(x, y) < 1.

r(x, y) is the reflectance function; for X-ray imaging it is replaced by a transmissivity function. The intensity of a monochrome image lies in the range L_min ≤ f(x, y) ≤ L_max, where L_min = i_min r_min and L_max = i_max r_max (typical indoor values: L_min ≈ 10 and L_max ≈ 1000).
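
As a toy illustration of the model above, the sketch below multiplies an assumed illumination array by an assumed reflectance array; the particular values are invented for the example.

```python
import numpy as np

# Minimal sketch of the image formation model f(x, y) = i(x, y) * r(x, y).
# The illumination and reflectance arrays are invented example values.
M, N = 4, 4
i = np.full((M, N), 500.0)                         # uniform illumination
r = np.linspace(0.05, 0.95, M * N).reshape(M, N)   # reflectance values in (0, 1)

f = i * r                                          # resulting monochrome intensities
print("L_min =", f.min(), " L_max =", f.max())
```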

2.4 Image Sampling and Quantization

To create a digital image from continuous sensed data f(x, y), two kinds of digitization are required: digitizing the coordinate values is called sampling, and digitizing the amplitude values is called quantization.
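
A minimal sketch of both steps, in which a finely sampled ramp stands in for the continuous image; the function name and parameters are illustrative.

```python
import numpy as np

# Minimal sketch: spatial sampling (keep every `step`-th value in each
# direction) followed by quantization of amplitudes to L = 2**bits levels.
def sample_and_quantize(dense, step, bits):
    sampled = dense[::step, ::step]                              # sampling
    L = 2 ** bits
    lo, hi = dense.min(), dense.max()
    quantized = np.round((sampled - lo) / (hi - lo) * (L - 1))   # quantization
    return quantized.astype(np.uint8)

dense = np.tile(np.linspace(0.0, 1.0, 512), (512, 1))   # horizontal ramp "scene"
digital = sample_and_quantize(dense, step=4, bits=3)     # 128 x 128, 8 gray levels
print(digital.shape, int(digital.min()), int(digital.max()))
```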


2.4.2 Representing Digital Images

The resulting image is a 2-D array (matrix) with M rows and N columns:

f(x, y) = \begin{bmatrix} f(0,0) & f(0,1) & \cdots & f(0,N-1) \\ f(1,0) & f(1,1) & \cdots & f(1,N-1) \\ \vdots & \vdots & & \vdots \\ f(M-1,0) & f(M-1,1) & \cdots & f(M-1,N-1) \end{bmatrix}

Each element of this matrix is called an image element, picture element, pixel, or pel.


The digitization process requires decisions about the values of M, N, and L: M and N determine the spatial resolution, and L the gray-level resolution, where L = 2^k for k bits per pixel. The number of bits required to store the image is b = M × N × k, or b = N² k when M = N.
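
For instance, the sketch below evaluates the storage requirement b = M × N × k for a few assumed image sizes.

```python
# Minimal sketch: bits (and bytes) needed to store an M x N image with k bits/pixel.
def storage_bits(M, N, k):
    return M * N * k

for M, N, k in [(256, 256, 8), (1024, 1024, 8), (1024, 1024, 1)]:
    b = storage_bits(M, N, k)
    print(f"{M} x {N} x {k} bits -> {b} bits = {b // 8} bytes")
```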


2.4.3 Spatial and Gray-Level Resolution

Sampling determines the spatial resolution and quantization determines the gray-level resolution. Spatial resolution: the number of sample points (e.g., the locations at which CCD elements read the reflected light). Gray-level resolution: the number of bits (or bytes) reserved for each pixel.


Contouring defect: when too few gray levels are used, ridge-like false contours appear in smooth areas of the image.

2.4.4 Aliasing and Moiré Patterns

A band-limited function contains no frequencies above a finite maximum. Undersampling such a function introduces aliasing: additional, corrupting frequency components appear in the sampled signal. The sampling rate is the number of samples taken per unit distance. The effect of aliasing can be reduced by attenuating the high-frequency components of the image (blurring) prior to sampling. A Moiré pattern is caused by a break-up of periodicity, e.g., when an image is scanned from a printed page composed of periodic ink dots.
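
The sketch below shows the same effect on a 1-D signal: sampling a 9 Hz sine at 12 samples per second (below the Nyquist rate of 18) yields samples identical, up to sign, to those of a 3 Hz sine, so the high frequency masquerades as a low one. The frequencies are arbitrary example values.

```python
import numpy as np

# Minimal sketch of aliasing: a 9 Hz sine undersampled at 12 samples/s is
# indistinguishable (up to sign) from a 3 Hz sine at the sample points.
fs = 12.0                            # sampling rate, samples per second
t = np.arange(0, 1, 1 / fs)          # sample instants over one second
f_true, f_alias = 9.0, fs - 9.0      # true frequency and its alias (3 Hz)

samples_true = np.sin(2 * np.pi * f_true * t)
samples_alias = np.sin(2 * np.pi * f_alias * t)
print(np.allclose(samples_true, -samples_alias))   # True: the samples coincide
```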


2.4.5 Zooming and Shrinking

Zooming requires two steps: creating new pixel locations and assigning gray levels to those locations.
Nearest-neighbor interpolation (pixel replication) copies the gray level of the closest original pixel.
Bilinear interpolation uses the four nearest neighbors: v(x', y') = a x' + b y' + c x' y' + d, where the coefficients a, b, c, and d are determined from the gray levels of the four neighbors.
Higher-order (non-linear) interpolation uses more neighbors.
Shrinking: direct shrinking (subsampling) causes aliasing; blurring the image slightly before shrinking it reduces aliasing.
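
The sketch below implements the bilinear idea directly, locating each new pixel in the original image's coordinate frame and blending its four nearest neighbors; the function name and the tiny test array are illustrative.

```python
import numpy as np

# Minimal sketch of zooming by bilinear interpolation: every output pixel is a
# weighted average of its four nearest neighbors in the input image.
def zoom_bilinear(img, factor):
    M, N = img.shape
    out = np.empty((int(M * factor), int(N * factor)), dtype=float)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            sy, sx = y / factor, x / factor            # position in the input grid
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, M - 1), min(x0 + 1, N - 1)
            dy, dx = sy - y0, sx - x0
            out[y, x] = (img[y0, x0] * (1 - dy) * (1 - dx) +
                         img[y0, x1] * (1 - dy) * dx +
                         img[y1, x0] * dy * (1 - dx) +
                         img[y1, x1] * dy * dx)
    return out

small = np.array([[0.0, 100.0], [100.0, 200.0]])
print(zoom_bilinear(small, 2))
```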


2.5 Basic Relations between Pixels

Neighbors of a pixel p at coordinates (x, y):
Horizontal and vertical neighbors: (x+1, y), (x-1, y), (x, y+1), (x, y-1); this set is the 4-neighbors of p, N4(p).
Four diagonal neighbors: (x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1); this set is ND(p).
The 8-neighbors of p: N8(p) = N4(p) ∪ ND(p).
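
The sketch below just enumerates these three neighborhoods as coordinate sets (ignoring image borders, where some neighbors fall outside the image); the function names are ours.

```python
# Minimal sketch: the 4-, diagonal, and 8-neighborhoods of a pixel p = (x, y).
# Border effects are ignored: near the image edge some neighbors fall outside.
def N4(x, y):
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def ND(x, y):
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def N8(x, y):
    return N4(x, y) | ND(x, y)      # union of the two sets

print(sorted(N8(5, 5)))             # the eight neighbors of pixel (5, 5)
```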

Adjacency:
4-adjacency: p and q are 4-adjacent if q ∈ N4(p).
8-adjacency: p and q are 8-adjacent if q ∈ N8(p).
m-adjacency (mixed adjacency): see Fig. 2.26.
A path (or curve) from pixel p = (x0, y0) to pixel q = (xn, yn) is a sequence of pixels (x0, y0), (x1, y1), ..., (xn, yn) in which (xi, yi) and (xi-1, yi-1) are adjacent. The path is closed if (x0, y0) = (xn, yn).

Connectivity: let S be a set of pixels in an image. Two pixels p and q are said to be connected in S if there exists a path between them consisting entirely of pixels in S. For any pixel p in S, the set of pixels connected to it in S is called a connected component of S. If S has only one connected component, S is called a connected set.
Regions: let R be a subset of pixels in an image; R is a region if it is a connected set.
Boundary: the set of pixels in a region R that have one or more neighbors that are not in R.
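
A small sketch of these definitions: a breadth-first search over 4-adjacency extracts the connected component of S that contains a given seed pixel. The set S and the seed are example values.

```python
from collections import deque

# Minimal sketch: the connected component (under 4-adjacency) of the pixel
# set S that contains the seed pixel. S is a set of (x, y) tuples.
def connected_component(S, seed):
    component = {seed}
    frontier = deque([seed])
    while frontier:
        x, y = frontier.popleft()
        for q in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):   # N4(p)
            if q in S and q not in component:
                component.add(q)
                frontier.append(q)
    return component

S = {(0, 0), (0, 1), (1, 1), (3, 3), (3, 4)}     # two separate groups of pixels
print(connected_component(S, (0, 0)))            # {(0, 0), (0, 1), (1, 1)}
```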


Distance measures between pixels p = (x, y) and q = (s, t):

Euclidean distance: D_e(p, q) = [(x - s)² + (y - t)²]^(1/2).

City-block (D4) distance: D4(p, q) = |x - s| + |y - t|. The pixels with D4 ≤ 2 from the center pixel form a diamond:

        2
      2 1 2
    2 1 0 1 2
      2 1 2
        2

Chessboard (D8) distance: D8(p, q) = max(|x - s|, |y - t|). The pixels with D8 ≤ 2 from the center pixel form a square:

    2 2 2 2 2
    2 1 1 1 2
    2 1 0 1 2
    2 1 1 1 2
    2 2 2 2 2
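
The sketch below computes the three distance measures for a pair of example pixels.

```python
import math

# Minimal sketch: Euclidean, city-block (D4), and chessboard (D8) distances
# between pixels p = (x, y) and q = (s, t).
def D_e(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def D4(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def D8(p, q):
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(D_e(p, q), D4(p, q), D8(p, q))   # 5.0 7 4
```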

2.5.4 Image Operations on a Pixel Basis

Image processing applies operations to pixels: an operator H maps an input image f(x, y) to an output image g(x, y). The operator H is linear if H(af + bg) = aH(f) + bH(g) for all images f, g and scalars a, b; otherwise it is nonlinear.
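
A quick sketch of the linearity test: summing all pixel values satisfies the condition, while taking the maximum does not. The operators and test images are arbitrary examples.

```python
import numpy as np

# Minimal sketch: checking H(a*f + b*g) == a*H(f) + b*H(g) for two operators.
rng = np.random.default_rng(0)
f, g = rng.random((4, 4)), rng.random((4, 4))
a, b = 2.0, -3.0

H_sum, H_max = np.sum, np.max
print(np.isclose(H_sum(a * f + b * g), a * H_sum(f) + b * H_sum(g)))  # True: sum is linear
print(np.isclose(H_max(a * f + b * g), a * H_max(f) + b * H_max(g)))  # False here: max is nonlinear
```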