Module 3: Video Sampling
Lecture 18: Filtering Operations in Camera and Display Devices


The Lecture Contains:
Effect of Temporal Aperture
Spatial Aperture
Effect of Display Aperture

Effect of Temporal Aperture:

A video camera typically accomplishes a certain degree of prefiltering in the capturing process. The intensity values read out at any frame instant are not the sensed intensity values at that time; rather, they are averages of the sensed signal over a certain duration, referred to as the exposure time $T_e$. Consequently, the camera can be viewed as applying a prefilter in the temporal domain with an impulse response given by

$$h_t(t) = \begin{cases} 1/T_e, & 0 \le t \le T_e, \\ 0, & \text{otherwise.} \end{cases}$$

The frequency response of this filter is

$$H_t(f_t) = \operatorname{sinc}(f_t T_e)\, e^{-j\pi f_t T_e}, \qquad \operatorname{sinc}(x) = \frac{\sin(\pi x)}{\pi x},$$

which reaches zero at $f_t = k/T_e,\ k = \pm 1, \pm 2, \ldots$ As $f_{s,t} = 1/\Delta t$ is the temporal sampling rate ($\Delta t$ being the frame interval), the ideal prefilter for this purpose would be a low-pass filter with cut-off frequency at $f_{s,t}/2$. By choosing $T_e \approx \Delta t$, so that the first zero of $H_t$ falls at the sampling rate, the camera can suppress the temporal aliasing components near $f_{s,t}$. However, too large a $T_e$ will blur the signal. In practice, the effect of blurring is sometimes more visible than aliasing, so the exposure time must be chosen to reach a proper trade-off between aliasing and blurring. In addition to the temporal integration discussed above, the camera also performs spatial integration.
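As a rough numerical illustration of this trade-off, the Python sketch below evaluates the magnitude response $|H_t(f_t)| = |\operatorname{sinc}(f_t T_e)|$ for a few exposure times relative to the frame interval of an assumed 30 frames/s camera (the frame rate and exposure values are illustrative, not from the lecture).

```python
import numpy as np

# Sketch: magnitude response of the box-shaped temporal aperture,
# |H_t(f)| = |sinc(f * T_e)|, for an assumed 30 frames/s camera.
frame_rate = 30.0            # temporal sampling rate f_s,t (Hz), assumed
dt = 1.0 / frame_rate        # frame interval

for Te in (0.25 * dt, 0.5 * dt, 1.0 * dt):   # candidate exposure times
    H = lambda f: np.abs(np.sinc(f * Te))    # np.sinc(x) = sin(pi x)/(pi x)
    print(f"T_e = {Te*1e3:5.2f} ms | "
          f"|H| at f_s,t/2 = {H(frame_rate/2):.3f} (blur in passband), "
          f"|H| at f_s,t   = {H(frame_rate):.3f} (aliasing leakage)")
```

With $T_e = \Delta t$ the response is nulled exactly at the sampling rate but drops to about 0.64 at $f_{s,t}/2$ (temporal blur), whereas shorter exposures keep the passband nearly flat at the cost of passing more energy near $f_{s,t}$ that can alias.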

Spatial Aperture:

The value of any pixel obtained from a sensor in a CCD camera is not the optical signal at that point alone. It is a weighted integration of the signal over a small window surrounding it, called the aperture of the camera. The shape of the aperture and the weighting values constitute the camera's spatial aperture function. The aperture function serves as the spatial prefilter, and its Fourier transform is referred to as the modulation transfer function (MTF) of the camera. For most cameras, we can approximate the spatial aperture function by a circularly symmetric Gaussian function given by

$$h_a(x, y) = \frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{x^2 + y^2}{2\sigma^2}\right).$$

The spectrum of this function is also Gaussian, given by

$$H_a(f_x, f_y) = \exp\!\left(-\frac{f_x^2 + f_y^2}{2\sigma_f^2}\right), \qquad \text{where } \sigma_f = \frac{1}{2\pi\sigma}.$$

The value of $\sigma$ (or $\sigma_f$) depends on the size and shape of the aperture. It is usually designed such that the frequency response is 0.5 at half the vertical and horizontal sampling rates. Assuming equal sampling rates $f_{s,x} = f_{s,y} = f_s$ and setting $H_a(f_s/2, 0) = 0.5$, we obtain

$$\sigma_f = \frac{f_s}{2\sqrt{2\ln 2}} \approx 0.42\, f_s.$$
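A minimal sketch of this design rule, assuming the Gaussian model above and a normalized sampling rate, is given below; it checks the 0.5 response at $f_s/2$ and shows the residual response at the sampling rate itself, which is the source of spatial aliasing.

```python
import numpy as np

# Sketch: Gaussian spatial aperture designed so that H_a = 0.5 at half the
# sampling rate, i.e. exp(-(f_s/2)^2 / (2*sigma_f^2)) = 0.5.
f_s = 1.0                                             # sampling rate (normalized)
sigma_f = f_s / (2.0 * np.sqrt(2.0 * np.log(2.0)))    # ~0.42 * f_s
sigma = 1.0 / (2.0 * np.pi * sigma_f)                 # spatial std. dev., ~0.37 / f_s

H = lambda fx, fy: np.exp(-(fx**2 + fy**2) / (2.0 * sigma_f**2))
print(f"sigma_f = {sigma_f:.3f} f_s, sigma = {sigma:.3f} / f_s")
print(f"H at f_s/2 = {H(f_s/2, 0):.3f}  (0.5 by design)")
print(f"H at f_s   = {H(f_s, 0):.3f}  (residual response at the sampling rate)")
```

Because the Gaussian never reaches zero, roughly 6% of the response remains at $f_s$, which is why spatial aliasing is only attenuated, not removed.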

The overall camera aperture function, or prefilter, is the product of the spatial and temporal apertures,

$$h_c(x, y, t) = h_a(x, y)\, h_t(t),$$

with a frequency response given by

$$H_c(f_x, f_y, f_t) = H_a(f_x, f_y)\, H_t(f_t).$$

(The original lecture shows the impulse response of a camera for particular values of $\sigma$ and $T_e$, together with the frequency response on one frequency plane; the figures are not reproduced here.) The frequency response is far from the ideal half-band lowpass filter, which should have a square pass-band defined by $|f_x| \le f_{s,x}/2$ and $|f_y| \le f_{s,y}/2$. Observing the frequency response, we note that frequency components inside the designed passband (the Voronoi cell) are attenuated, thereby reducing the signal resolution unnecessarily. In addition, the frequency components in the desired stop band are not completely removed, which results in aliasing in the sampled signal. It has been found that viewers are more annoyed by loss of resolution than by aliasing artifacts, because aliasing artifacts are visually noticeable only in images that contain high-frequency periodic patterns close to the lowest aliasing frequency, which is rare in natural scenes. It is therefore preferable to preserve the signal in the passband rather than to suppress it completely outside this band. Digital filters are often used for more precise prefiltering: the signal is sampled at a rate higher than the desired sampling rate, a digital filter suppresses the frequencies outside the desired passband, and a down-sampler converts the processed digital signal to the desired sampling rate. Although sharp-transition digital filters give better performance in terms of mean square error, they may give rise to the Gibbs effect (ringing) at sharp edges and require high-order filters for their implementation, which is a serious drawback in video applications.
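The oversample, digital lowpass, and downsample chain described above can be sketched in a few lines; the sampling rates, filter length, and test signal below are illustrative assumptions, not values from the lecture.

```python
import numpy as np
from scipy import signal

# Sketch of the "oversample -> digital lowpass -> downsample" prefiltering
# described above. Rates, filter length and test signal are assumptions.
target_rate = 1000.0                 # desired sampling rate (samples/s)
M = 4                                # oversampling factor
fs_high = M * target_rate            # rate at which the sensor is read out

t = np.arange(0, 1.0, 1.0 / fs_high)
# 50 Hz stays in the passband; 650 Hz lies above the new Nyquist (500 Hz)
# and would alias to 350 Hz if we simply dropped samples.
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 650 * t)

# Linear-phase FIR lowpass with cutoff at the new Nyquist frequency
# (cutoff is given as a fraction of the current Nyquist, fs_high / 2).
taps = signal.firwin(numtaps=63, cutoff=1.0 / M)
y = signal.filtfilt(taps, [1.0], x)  # zero-phase filtering, for simplicity
y_down = y[::M]                      # down-sample to the desired rate

print(len(x), "samples at", fs_high, "Hz ->",
      len(y_down), "samples at", target_rate, "Hz")
```

Making the transition band sharper (more taps) improves the mean-square behaviour but increases ringing at edges and filter cost, which is the Gibbs-related drawback noted above.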

Effect of Display Aperture:

In a CRT monitor, an electron gun scans an electron beam across the screen line by line, striking the phosphors with intensities proportional to the intensity of the video signal at the corresponding locations. For the display of a color signal, three beams are emitted by three guns, striking the red, green and blue phosphors with the desired intensity combination at each location. The thickness of the striking beam plays an important role: it determines the amount of vertical filtering. A thin beam makes the image look sharp but also causes the scan lines to be perceived when the observer sits too close to the screen; a thick beam, on the other hand, blurs the image. Normally, to reduce the loss of spatial resolution, thin beams are used, so that very little vertical filtering is carried out by the display device. Temporal filtering is determined by the phosphor decay. The P22 phosphors used in color TVs decay to less than 10% of the peak value within about 10 μs to 1 ms, which is much smaller than the field time of 16.7 ms; thus practically no temporal filtering is performed. Based on its spatio-temporal frequency response, the human visual system (HVS) itself performs, to some degree, the required interpolation.
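As a rough illustration (with made-up beam widths, not values from the lecture), the sketch below models the vertical display aperture as a Gaussian beam profile and compares how strongly a thin and a thick beam attenuate the scan-line frequency versus genuine vertical image detail.

```python
import numpy as np

# Sketch with assumed beam widths: vertical filtering by a Gaussian CRT beam
# profile. A thin beam barely attenuates the scan-line frequency (so the line
# structure stays visible up close); a thick beam suppresses it but also blurs
# genuine vertical detail.
line_pitch = 1.0                      # distance between scan lines (normalized)
f_line = 1.0 / line_pitch             # scan-line frequency

for name, sigma in (("thin beam ", 0.1 * line_pitch),
                    ("thick beam", 0.5 * line_pitch)):
    H = lambda f: np.exp(-2.0 * (np.pi * sigma * f) ** 2)  # FT of the Gaussian profile
    print(f"{name}: response at scan-line frequency = {H(f_line):.3f}, "
          f"at half that frequency (image detail) = {H(f_line / 2):.3f}")
```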