A Unified Framework for the Consumer-Grade Image Pipeline

A Unified Framework for the Consumer-Grade Image Pipeline
Konstantinos N. Plataniotis, University of Toronto
kostas@dsp.utoronto.ca, www.dsp.utoronto.ca
Joint work with Rastislav Lukac

Outline
- The problem
- Background
- Single-Sensor Imaging: Challenges & Opportunities
- Performance issues
- Conclusions

Digital color imaging
[Figure: color "Parrots" image decomposed into its R, G, and B channels.]
A color image is a two-dimensional $K_1 \times K_2$ array ($K_1$ rows, $K_2$ columns) with $m = 3$ color channels. A pixel at row $k_1$ and column $k_2$ occupies spatial position $i = (k_1 - 1)K_1 + k_2$; the sample pixel $x_i = (186, 48, 42)$ has components $i_1 = 186$ (R), $i_2 = 48$ (G), $i_3 = 42$ (B).
Color acquisition:
- digital cameras - the most popular and widely used
- scanners
- synthetic images (e.g. gray-scale coloration)

Focusing on the color pixel level
RGB (sRGB) color space:
- commonly used for acquisition, storage, and display purposes
- additive concept of color composition (Red, Green, Blue; Cyan, Magenta, Yellow; Black, White)
- an RGB color pixel is a vector in a three-dimensional (RGB) color space; the vector components are the intensities measured in the R, G, and B color channels
[Figure: RGB color cube and Maxwell triangle.]

Color imaging basics
A color vector is uniquely characterized by its
- magnitude (length): $M_i = \sqrt{(i_1)^2 + (i_2)^2 + (i_3)^2}$
- direction (orientation): $i^D = \frac{i}{M_i} = \left[\frac{i_1}{M_i}, \frac{i_2}{M_i}, \frac{i_3}{M_i}\right]$, a unit vector on the unit sphere in the RGB space
[Figure: color vector and its projection onto the unit sphere in RGB coordinates.]

Camera: end-user's point of view
Focus on effectiveness, i.e. functionality vs. cost:
- optics (optical zoom), digital zoom, memory, battery, etc.
- multimedia acquisition, processing & transmission (image, audio, and video)
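
To make the vector decomposition concrete, here is a minimal NumPy sketch (my own illustration, not part of the slides) that splits the sample pixel (186, 48, 42) into its magnitude and unit direction:

```python
import numpy as np

def magnitude_and_direction(pixel):
    """Split an RGB pixel (i1, i2, i3) into its magnitude and unit direction."""
    i = np.asarray(pixel, dtype=float)
    m = np.linalg.norm(i)                       # M_i = sqrt(i1^2 + i2^2 + i3^2)
    d = i / m if m > 0 else np.zeros_like(i)    # i^D lies on the unit sphere
    return m, d

m, d = magnitude_and_direction((186, 48, 42))
print(m, d)   # magnitude ~ 196.6 and the corresponding unit direction vector
```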

Three-sensor imaging
Sensor: a monochromatic device; the most expensive component of the digital camera (10% to 25% of the total cost)
- charge-coupled device (CCD)
- complementary metal oxide semiconductor (CMOS) sensor
Professional designs: the scene passes through the optical system, and each sensor corresponds to a particular color channel via spectrally selective R, G, and B filters placed in front of the CCD/CMOS sensors; the sensor data are arranged as RGB color data at the camera output. An expensive solution.

X3 technology-based single-sensor imaging
Layered (three-layer) silicon sensor:
- new technology - an expensive solution for professional devices (medical & science applications)
- directly captures RGB light at each spatial location in an image during a single exposure
- takes advantage of the natural light-absorbing characteristics of silicon
- color filters are stacked vertically and ordered according to the energy of the photons absorbed by silicon

Single-sensor imaging
Pipeline: scene -> optical system -> color filter array (CFA) + sensor (CCD/CMOS) -> sensor data -> demosaicking -> sensor data arranged as RGB color data -> camera output.
The CFA generates a 2-D array or mosaic of color components; the produced CFA (sensor) image is a gray-scale image, and the full-color image is obtained through digital processing.

Color filter array (CFA) design
Key factors in CFA design:
- immunity to color artifacts and color moiré
- cost-effective reconstruction
- reaction of the pattern to sensor imperfections
- immunity to optical/electrical cross talk between neighboring pixels
Color systems used in CFA design:
i) tri-stimulus (RGB, YMC) systems - RGB is the most widely used
ii) mixed primary/complementary colors (e.g. the MGCY pattern)
iii) four- and more-color systems (white and/or colors with shifted spectral sensitivity)
CFAs in ii) and iii) may produce a more accurate hue gamut, but they limit the useful range of the darker colors.

Common RGB-based CFAs
Bayer CFA, diagonal stripe CFA, vertical stripe CFA, Yamanaka CFA, diagonal Bayer CFA, pseudo-random CFAs, HVS-based design.
The Bayer CFA is widely used (good performance, cost-effective color reconstruction).

Single-sensor camera architecture
Optical path: scene -> optical system (lens, zoom, focus; aperture and shutter; viewfinder) -> blocking system (infrared-blocking, anti-aliasing optical filter) -> (Bayer) CFA -> sensor (CCD, CMOS) -> A/D converter.
On the camera bus: microprocessor, camera ASIC, DRAM buffer, firmware memory, memory stick/card media, user interface, color LCD display, flash, power supply (battery, AC), PC/TV interface (USB, A/V).
The DRAM buffer temporarily stores the digital data from the A/D converter and then passes them to the application-specific integrated circuit (ASIC); digital data processing, such as demosaicking and resizing, is realized in both the ASIC and the microprocessor.

Camera processing
Processing:
- demosaicking (spectral interpolation)
- demosaicked image postprocessing (color enhancement)
- camera zooming (spatial interpolation in the CFA or full-color domain)
Compression:
- lossy (or near-lossless) vs. lossless compression
- CFA compression vs. demosaicked image compression
Data management:
- camera (CFA) image indexing, connection to image retrieval

Implementation
Conventional digital camera: real-time constraints (computational simplicity requirements); CFA data -> camera processing -> storage, all inside the digital camera.
Using a companion personal computer (PC): the PC interfaces with the digital camera, which stores the images in the raw CFA format; this allows for the utilization of sophisticated solutions. CFA data -> storage (digital camera) -> camera processing -> storage (personal computer).

Camera processing operations
Considering the spectral characteristics:
- component-wise (marginal) processing (component -> component)
- spectral model-based processing (vector -> component)
- vector processing (vector -> vector)
Considering the image content (structure):
- non-adaptive processing
- (data-)adaptive processing
Practical solutions:
- a spectral model is used to eliminate color shifts and artifacts
- an edge-sensing mechanism is used to eliminate edge blurring and to produce sharp-looking fine details
[Diagram: input camera image -> edge-sensing mechanism + spectral model -> estimation operations -> outputted camera image (generalized camera solutions).]

Considering the spectral characteristics
Component-wise processing: each color plane is processed separately; the omission of the spectral information results in color shifts and artifacts.
Spectral model-based processing: essential spectral information is utilized during processing; computationally very efficient - the most widely used approach in camera image processing.
Vector processing: pixels are processed as vectors; computationally expensive.

Considering the image content
Non-adaptive processing:
- no data-adaptive control
- often reduces to linear processing - easy to implement
- performance limitations (image blurring)
- no parameters, or a fixed setting
Adaptive processing:
- edge-sensing weights used to follow the structural content
- nonlinear processing
- enhanced performance, sharp-looking images
- processing parameters adapt to the data

Data-adaptive processing
Construction: the estimate is a normalized, weighted combination of spectral-model terms over a local neighborhood,
$\bar{x}_{(p,q)} = \sum_{(i,j)\in\zeta} w'_{(i,j)} \Psi(x_{(i,j)}, \bar{x}_{(p,q)})$, with $w'_{(i,j)} = w_{(i,j)} / \sum_{(i,j)\in\zeta} w_{(i,j)}$,
using spatial, structural, and spectral characteristics:
- spatial characteristics: local neighborhood area $\zeta$
- structural characteristics: edge-sensing mechanism $\lambda$, with $\lambda(z) \to \{w_{(i,j)}, (i,j)\in\zeta\}$, where $z$ denotes the CFA image
- spectral characteristics: spectral model $\Psi$
[Diagram: input camera image -> edge-sensing mechanism + spectral model -> estimation operations -> outputted camera image (generalized camera solutions).]
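
As an illustration of the data-adaptive construction, the following sketch (not from the slides; the function name and example numbers are hypothetical) normalizes a set of edge-sensing weights and combines per-neighbor spectral-model estimates:

```python
import numpy as np

def data_adaptive_estimate(neighbor_estimates, weights):
    """Normalized, weighted combination of per-neighbor estimates.

    neighbor_estimates: spectral-model outputs Psi(x_(i,j), .) for (i,j) in zeta
    weights:            edge-sensing weights w_(i,j) for the same neighbors
    """
    w = np.asarray(weights, dtype=float)
    v = np.asarray(neighbor_estimates, dtype=float)
    w_norm = w / w.sum()            # w'_(i,j) = w_(i,j) / sum of weights
    return float(np.dot(w_norm, v))

# Example: four-neighborhood estimates; the small weight marks a neighbor across an edge
print(data_adaptive_estimate([120, 118, 90, 122], [0.8, 0.9, 0.1, 0.85]))
```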

Local neighborhood area
Features are approximated using a shape mask $\zeta$; the shape and size of $\zeta$ vary depending on the CFA used and the processing task (demosaicking, resizing, etc.).
Shape masks widely used in the demosaicking process (panels (a)-(e) on the slide):
- (a,d,e) $\zeta = \{(p-1,q), (p,q-1), (p,q+1), (p+1,q)\}$
- (b,c) $\zeta = \{(p-1,q-1), (p-1,q+1), (p+1,q-1), (p+1,q+1)\}$

Edge-sensing mechanism (ESM)
Essential to produce sharp-looking images: the structural constraints imposed on the camera solution relate to the form of the ESM operator $\lambda$ used to generate the edge-sensing weights, $\lambda(z) \to \{w_{(i,j)}, (i,j)\in\zeta\}$.
Concept: the ESM operator $\lambda$ uses some form of inverse gradient of the samples in the CFA image,
$w_{(i,j)} = \frac{1}{1 + f(z_{(i,j)})}$,
where both structural and spatial characteristics are considered in the ESM construction; large gradients usually indicate that the corresponding vectors are located across edges, and such neighbors are penalized through small weights.

Edge-sensing mechanism (ESM)
Conventional designs: operate on a large (5x5, 7x7) neighbourhood; specialized to a particular CFA (e.g. the Bayer CFA).
For the shape mask $\zeta = \{(p-1,q), (p,q-1), (p,q+1), (p+1,q)\}$:
$w_{(p-1,q)} = 1/(1 + |z_{(p-2,q)} - z_{(p,q)}| + |z_{(p-1,q)} - z_{(p+1,q)}|)$
$w_{(p,q-1)} = 1/(1 + |z_{(p,q-2)} - z_{(p,q)}| + |z_{(p,q-1)} - z_{(p,q+1)}|)$
$w_{(p,q+1)} = 1/(1 + |z_{(p,q+2)} - z_{(p,q)}| + |z_{(p,q+1)} - z_{(p,q-1)}|)$
$w_{(p+1,q)} = 1/(1 + |z_{(p+2,q)} - z_{(p,q)}| + |z_{(p+1,q)} - z_{(p-1,q)}|)$
For the shape mask $\zeta = \{(p-1,q-1), (p-1,q+1), (p+1,q-1), (p+1,q+1)\}$:
$w_{(p-1,q-1)} = 1/(1 + |z_{(p-2,q-2)} - z_{(p,q)}| + |z_{(p-1,q-1)} - z_{(p+1,q+1)}|)$
$w_{(p-1,q+1)} = 1/(1 + |z_{(p-2,q+2)} - z_{(p,q)}| + |z_{(p-1,q+1)} - z_{(p+1,q-1)}|)$
$w_{(p+1,q-1)} = 1/(1 + |z_{(p+2,q-2)} - z_{(p,q)}| + |z_{(p+1,q-1)} - z_{(p-1,q+1)}|)$
$w_{(p+1,q+1)} = 1/(1 + |z_{(p+2,q+2)} - z_{(p,q)}| + |z_{(p+1,q+1)} - z_{(p-1,q-1)}|)$

Edge-sensing mechanism (ESM)
Cost-effective, universal design: operates within the shape mask $\zeta$; an aggregation concept defined here over the four-neighborhood only; the design is suitable for any existing CFA.
Fully automated solution (demosaicking performed inside the digital camera, then storage):
$w_{(i,j)} = 1 / \bigl(1 + \sum_{(g,h)\in\varsigma} |z_{(i,j)} - z_{(g,h)}|\bigr)$
End-user controlled solution (CFA data stored in the camera; demosaicking and parameter setting on a personal computer, followed by visual inspection or storage):
$w_{(i,j)} = \beta \bigl(1 + \exp\{\sum_{(g,h)\in\varsigma} |z_{(i,j)} - z_{(g,h)}|\}\bigr)^{-r}$
This form matches the HVS properties better.
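
A minimal sketch of the conventional four-neighbor ESM above, assuming the CFA image is stored as a NumPy array indexed [row, column]; the synthetic test image and function name are my own:

```python
import numpy as np

def bayer_esm_weights(z, p, q):
    """Inverse-gradient edge-sensing weights for the four-neighbor shape mask
    zeta = {(p-1,q), (p,q-1), (p,q+1), (p+1,q)} on a CFA image z.
    Assumes (p, q) is at least two samples away from the image border."""
    w_up    = 1.0 / (1 + abs(z[p-2, q] - z[p, q]) + abs(z[p-1, q] - z[p+1, q]))
    w_left  = 1.0 / (1 + abs(z[p, q-2] - z[p, q]) + abs(z[p, q-1] - z[p, q+1]))
    w_right = 1.0 / (1 + abs(z[p, q+2] - z[p, q]) + abs(z[p, q+1] - z[p, q-1]))
    w_down  = 1.0 / (1 + abs(z[p+2, q] - z[p, q]) + abs(z[p+1, q] - z[p-1, q]))
    return np.array([w_up, w_left, w_right, w_down])

# Synthetic CFA image with a vertical edge: weights across the edge are penalized
z = np.zeros((7, 7)); z[:, 4:] = 200.0
print(bayer_esm_weights(z, 3, 3))   # up/down ~ 1, left/right much smaller
```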

Spectral model (SM)
The SM considers the spectral & spatial characteristics of neighboring color pixels: the pixel $x_{(p,q)}$ occupying the location to be interpolated and the pixel $x_{(i,j)}$ occupying a neighboring location, with $x_{(p,q)} = [x_{(p,q)1}, x_{(p,q)2}, x_{(p,q)3}]$ and $x_{(i,j)} = [x_{(i,j)1}, x_{(i,j)2}, x_{(i,j)3}]$.
Modelling assumptions in the existing SMs:
- color ratio model (uniform hue modelling assumption): $x_{(p,q)k} / x_{(p,q)2} = x_{(i,j)k} / x_{(i,j)2}$, for $k = 1$ or $k = 3$
- normalized color ratio model (hue constancy is enforced both in edge transitions and in uniform areas): $(x_{(p,q)k} + \gamma)/(x_{(p,q)2} + \gamma) = (x_{(i,j)k} + \gamma)/(x_{(i,j)2} + \gamma)$
- color difference model (constrained component-wise magnitude difference): $x_{(p,q)k} - x_{(p,q)2} = x_{(i,j)k} - x_{(i,j)2}$

Vector SM
Modelling assumption: two neighboring vectors should have identical color chromaticity properties (directional characteristics), i.e. two spatially neighboring vectors should be collinear in the RGB (vector) color space.
Computational approach:
$\cos\langle x_{(p,q)}, x_{(i,j)} \rangle = \frac{x_{(p,q)} \cdot x_{(i,j)}}{\|x_{(p,q)}\| \, \|x_{(i,j)}\|} = \frac{\sum_{k=1}^{3} x_{(p,q)k} x_{(i,j)k}}{\sqrt{\sum_{k=1}^{3} x_{(p,q)k}^2} \, \sqrt{\sum_{k=1}^{3} x_{(i,j)k}^2}} = 1$
Any color component can be determined from the expression above by solving the quadratic equation $a y^2 + b y + c = 0$, where $y$ denotes the component to be determined, e.g. $y = x_{(p,q)2}$.
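
The three scalar spectral models can be compared directly; the sketch below (illustrative only, with hypothetical parameter names and example values) estimates a missing R or B component from one neighbor under each modelling assumption:

```python
def estimate_from_neighbor(x_pq_g, x_ij_k, x_ij_g, model="ratio", gamma=0.0):
    """Estimate the missing component x_(p,q)k from a single neighbor using one
    of the scalar spectral models listed above.

    x_pq_g : known G value at the location being interpolated, x_(p,q)2
    x_ij_k : neighbor's k-th component (R or B), x_(i,j)k
    x_ij_g : neighbor's G component, x_(i,j)2
    """
    if model == "ratio":            # x_(p,q)k / x_(p,q)2 = x_(i,j)k / x_(i,j)2
        return x_ij_k * x_pq_g / x_ij_g
    if model == "normalized_ratio": # ratios of gamma-shifted values are equal
        return (x_ij_k + gamma) * (x_pq_g + gamma) / (x_ij_g + gamma) - gamma
    if model == "difference":       # x_(p,q)k - x_(p,q)2 = x_(i,j)k - x_(i,j)2
        return x_pq_g + (x_ij_k - x_ij_g)
    raise ValueError(model)

print(estimate_from_neighbor(100, 80, 90, "ratio"))                        # ~88.9
print(estimate_from_neighbor(100, 80, 90, "normalized_ratio", gamma=256))  # ~89.7
print(estimate_from_neighbor(100, 80, 90, "difference"))                   # 90.0
```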

Vector SM
Unique quadratic equation solution: $y = y_1 = y_2 = -\frac{b}{2a}$, due to the zero discriminant $b^2 - 4ac = 0$.
Geometric interpretation from the two-component vector expression (the interpolated component lies on the chromaticity line through the available components in the G vs. R (or B) plane):
- for the G component: $x_{(p,q)2} = x_{(p,q)k} \, x_{(i,j)2} / x_{(i,j)k}$
- for the R or B component: $x_{(p,q)k} = x_{(p,q)2} \, x_{(i,j)k} / x_{(i,j)2}$

Vector SM
Geometric interpretation from the three-component vector expression, $y = y_1 = y_2 = -\frac{b}{2a}$, in the RGB space:
- for the G component: $x_{(p,q)2} = \dfrac{x_{(p,q)1} x_{(i,j)1} x_{(i,j)2} + x_{(p,q)3} x_{(i,j)2} x_{(i,j)3}}{x_{(i,j)1}^2 + x_{(i,j)3}^2}$
- for the R component: $x_{(p,q)1} = \dfrac{x_{(p,q)2} x_{(i,j)1} x_{(i,j)2} + x_{(p,q)3} x_{(i,j)1} x_{(i,j)3}}{x_{(i,j)2}^2 + x_{(i,j)3}^2}$
- for the B component: $x_{(p,q)3} = \dfrac{x_{(p,q)1} x_{(i,j)1} x_{(i,j)3} + x_{(p,q)2} x_{(i,j)2} x_{(i,j)3}}{x_{(i,j)1}^2 + x_{(i,j)2}^2}$
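
A small check of the reconstructed three-component expression for the G component (a sketch under the assumption that the formula above is read correctly): if the pixel being completed is exactly collinear with its neighbor, the true G value is recovered.

```python
import numpy as np

def vector_sm_green(x_pq_r, x_pq_b, x_ij):
    """Three-component vector SM estimate of the missing G value x_(p,q)2,
    given the known R and B values at (p,q) and the full neighbor x_(i,j)."""
    j1, j2, j3 = x_ij
    return (x_pq_r * j1 * j2 + x_pq_b * j2 * j3) / (j1**2 + j3**2)

x_ij = np.array([60.0, 120.0, 30.0])
x_pq = 1.5 * x_ij                                  # collinear pixel: [90, 180, 45]
print(vector_sm_green(x_pq[0], x_pq[2], x_ij))     # -> 180.0, the true G value
```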

Generalized vector SM
Linear shifting of the input vectors by $\gamma$ modifies their directional characteristics and normalizes their component-wise magnitude differences:
$\cos\langle x_{(p,q)} + \gamma\mathbf{1}, \, x_{(i,j)} + \gamma\mathbf{1} \rangle = \frac{[x_{(p,q)} + \gamma\mathbf{1}] \cdot [x_{(i,j)} + \gamma\mathbf{1}]}{\|x_{(p,q)} + \gamma\mathbf{1}\| \, \|x_{(i,j)} + \gamma\mathbf{1}\|} = 1$, with $\mathbf{1}$ the all-ones vector.
[Figures: geometric interpretation of the 2-D case - the component to be calculated lies on the chromaticity line through the available components; the output direction obtained via $\gamma \gg 0$ is intermediate between the original direction ($\gamma = 0$) and the shifted one.]

Generalized vector SM
Features:
- universal solution, easy to implement
- tunes both the directional & magnitude characteristics
- generalizes all previous spectral models:
  - non-shifted vector model ($\gamma = 0$, three-component expression)
  - normalized color ratio model (two-component expression)
  - color ratio model ($\gamma = 0$, two-component expression)
  - color difference model ($\gamma \to \infty$, two-component expression)
Vector SM based data-adaptive estimator:
$x_{(p,q)k} = \sum_{(i,j)\in\zeta} w'_{(i,j)} \, y^{\gamma}_{(i,j)(p,q)k}$,
where $y^{\gamma}_{(i,j)(p,q)k}$ denotes the component estimate obtained from the generalized vector SM for neighbor $(i,j)$ with shifting parameter $\gamma$.
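
The limiting behaviour claimed above can be verified numerically; this sketch (my own, using the two-component $\gamma$-shifted expression) shows the estimate moving from the color-ratio result at $\gamma = 0$ toward the color-difference result as $\gamma$ grows:

```python
def gamma_shifted_estimate(x_pq_g, x_ij_k, x_ij_g, gamma):
    """Two-component generalized (gamma-shifted) estimate of x_(p,q)k."""
    return (x_ij_k + gamma) * (x_pq_g + gamma) / (x_ij_g + gamma) - gamma

x_pq_g, x_ij_k, x_ij_g = 100.0, 80.0, 90.0
for gamma in (0.0, 10.0, 100.0, 1e6):
    print(gamma, gamma_shifted_estimate(x_pq_g, x_ij_k, x_ij_g, gamma))
# gamma = 0      -> ~88.9  (color ratio model)
# gamma -> inf   -> 90.0   (color difference model: x_pq_g + (x_ij_k - x_ij_g))
```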

Demosaicking (spectral interpolation)
From a gray-scale input to a full-color output: the acquired image $z : Z^2 \to Z$, the colored CFA image $x : Z^2 \to Z^3$, and the restored image $y : Z^2 \to Z^3$, each of size $K_1 \times K_2$. For the Bayer CFA, color restoration starts from
$x_{(p,q)} = \begin{cases} [z_{(p,q)}, 0, 0] & \text{for } p \text{ odd and } q \text{ even}, \\ [0, 0, z_{(p,q)}] & \text{for } p \text{ even and } q \text{ odd}, \\ [0, z_{(p,q)}, 0] & \text{otherwise.} \end{cases}$   (1)

Demosaicking (spectral interpolation)
A color image is obtained only through demosaicking, an integral processing step in the pipeline, and should be supported by post-processing (correction): the demosaicking process is mandatory, the correction process is optional.
Demosaicking: from the Bayer original R and B CFA data, the G plane is populated first, then the R and B planes are populated using the SM, yielding the restored color image.
Correction: the G plane is corrected via the SM using the R or B color components, then the R and B planes are corrected using the SM, yielding a corrected color image pleasing for viewing.
Demosaicking vs. demosaicked image post-processing: two fundamentally different processing steps, yet they utilize similar, if not identical, signal processing concepts. Post-processing of demosaicked images is a novel application.
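
A minimal sketch of eq. (1), assuming 1-based row/column indices as on the slides (the function name is my own); it turns a full-color array into the gray-scale Bayer mosaic z and the same data arranged as sparse RGB:

```python
import numpy as np

def bayer_cfa_from_rgb(rgb):
    """Sample a full-color image into Bayer CFA data following eq. (1):
    R where p is odd and q is even, B where p is even and q is odd,
    G elsewhere (p, q are 1-based row/column indices)."""
    rows, cols, _ = rgb.shape
    z = np.zeros((rows, cols), dtype=rgb.dtype)
    x = np.zeros_like(rgb)                       # CFA data arranged as RGB
    for r in range(rows):
        for c in range(cols):
            p, q = r + 1, c + 1                  # 1-based indices
            if p % 2 == 1 and q % 2 == 0:
                k = 0                            # R location
            elif p % 2 == 0 and q % 2 == 1:
                k = 2                            # B location
            else:
                k = 1                            # G location
            z[r, c] = rgb[r, c, k]
            x[r, c, k] = rgb[r, c, k]
    return z, x

rgb = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
z, x = bayer_cfa_from_rgb(rgb)
print(z)    # gray-scale mosaic; x holds the same data as a sparse RGB array
```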

SM and ESM vs. color reconstruction quality
[Figure: demosaicking results without SM and ESM; with ESM but the SM omitted; with SM but the ESM omitted; and with both SM and ESM used.]

CFA selection vs. demosaicking
Impact on image quality: quality varies significantly depending on both the CFA and the input image content.
Impact on computational complexity: increased complexity for pseudo-random and random CFAs; the Bayer CFA offers one of the simplest color reconstructions.
[Figure: reconstruction results for different CFAs (solution A, solution B).]

Demosaicked image post-processing
Full-color image enhancement: postprocessing the demosaicked image is an optional step, implemented mainly in software and activated by the end-user.
Pipeline: scene -> optics -> CFA & sensor -> A/D -> CFA image (gray-scale data) -> demosaicking (spectral interpolation) -> demosaicked image (full-color) -> postprocessing (color correction & color balancing, color enhancement) -> postprocessed demosaicked image with enhanced quality -> camera output & quality evaluation.
Postprocessing:
- localizes and eliminates false colors created during demosaicking
- improves both the color appearance and the sharpness of the demosaicked image
- unlike demosaicking, postprocessing can be applied iteratively until certain quality criteria are met

Demosaicked image post-processing
[Figure: demosaicked images (top rows) and postprocessed images (bottom rows) for the BI, MFI, CHI, ECI, and SAIG methods, panels (a)-(f).]

Motivation
Digital zooming in imaging devices:
- technological advances -> miniaturization of single-sensor cameras
- pocket devices, mobile phones and PDAs -> low optical capabilities and computational resources
- to improve the functionality and quality of the output -> increase the spatial resolution of the camera output

Image zooming (spatial interpolation)
Zooming in the RGB domain (conventionally used): slower - more samples to process; amplification of the imperfections introduced during demosaicking. Pipeline: CFA data -> demosaicking -> color image zooming -> zoomed image.
Zooming in the CFA domain (novel approach): operates on noise-free samples; spectral interpolation follows spatial interpolation. Pipeline: CFA data -> CFA zooming -> demosaicking -> zoomed image.

Demosaicked (full-color) zooming
Zooming in the RGB domain (conventionally used): the image lattice is processed with a 3x3 supporting window centered on the sample at (p,q), with neighbors from (p-1,q-1) to (p+1,q+1).

Demosaicked (full-color) zooming
Pixel arrangements observed during processing:
- diagonal neighbors only, (p-1,q-1), (p-1,q+1), (p+1,q-1), (p+1,q+1) - not enough information
- axial neighbors only, (p-1,q), (p,q-1), (p,q+1), (p+1,q) - not enough information

Demosaicked (full-color) zooming
Zooming methods: adaptive vs. non-adaptive; component-wise vs. vector.
[Figure: original image, component-wise median zooming, vector median zooming.]

CFA zooming
Filling the CFA components: the conventional approach destroys the underlying CFA structure, so specially designed filling operations are used. For the Bayer CFA, each acquired sample $z_{(p,q)}$ is mapped into the enlarged lattice as
$z'_{(2p-1,2q)} = z_{(p,q)}$ for (odd $p$, even $q$), $z'_{(2p,2q-1)} = z_{(p,q)}$ for (even $p$, odd $q$), and $z'_{(2p-1,2q-1)} = z_{(p,q)}$ otherwise.
[Figure: input Bayer CFA, conventional approach, CFA-based approach.]
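
The following sketch implements the filling rule as reconstructed above (this reading of the garbled slide is an assumption); the target positions preserve the parity pattern of eq. (1), so the enlarged array is again a valid Bayer CFA:

```python
import numpy as np

def fill_enlarged_bayer(z):
    """Map each Bayer CFA sample z_(p,q) into a 2x-enlarged lattice while keeping
    the Bayer structure (1-based slide indices converted to 0-based offsets)."""
    rows, cols = z.shape
    z_big = np.zeros((2 * rows, 2 * cols), dtype=z.dtype)
    for r in range(rows):
        for c in range(cols):
            p, q = r + 1, c + 1
            if p % 2 == 1 and q % 2 == 0:        # R sample
                tp, tq = 2 * p - 1, 2 * q
            elif p % 2 == 0 and q % 2 == 1:      # B sample
                tp, tq = 2 * p, 2 * q - 1
            else:                                 # G sample
                tp, tq = 2 * p - 1, 2 * q - 1
            z_big[tp - 1, tq - 1] = z[r, c]       # remaining positions stay empty
    return z_big

z = np.arange(1, 17, dtype=float).reshape(4, 4)
print(fill_enlarged_bayer(z))
```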

CFA zooming
G interpolation step: the new G sample is an edge-weighted average of its four G neighbors $z_1, z_2, z_3, z_4$,
$z_{(p,q)} = \sum_{j=1}^{4} w'_j z_j$, with edge-sensing weights $w_i = \dfrac{1}{1 + \sum_{j=1}^{4} |z_i - z_j|}$ and $w'_j = w_j / \sum_{j=1}^{4} w_j$.

CFA zooming
R interpolation steps: utilize both spatial and spectral characteristics; the spectral quantities are formed using spatially shifted samples,
$z_{(p,q)} = z_{(p,q-1)} + \dfrac{\sum_{j=1}^{4} w_j \bar{z}_j}{\sum_{j=1}^{4} w_j}$, where $\bar{z}_i = R_i - G_i$.
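
A sketch of the G interpolation step (illustrative; the neighbor values are hypothetical), using the inverse-gradient weights defined above:

```python
import numpy as np

def g_interpolation(z_neighbors):
    """Edge-weighted G interpolation step for CFA zooming: a normalized weighted
    average of the four surrounding G samples, with inverse-gradient weights.
    z_neighbors = [z1, z2, z3, z4]."""
    z = np.asarray(z_neighbors, dtype=float)
    w = np.array([1.0 / (1 + np.sum(np.abs(zi - z))) for zi in z])
    return float(np.dot(w / w.sum(), z))

# The neighbor across an edge (200) receives a small weight and is down-weighted
print(g_interpolation([100, 104, 98, 200]))
```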

CFA zooming
B interpolation steps: diagonal symmetry compared to the R components; the spectral quantities are again formed using spatially shifted samples,
$z_{(p,q)} = z_{(p-1,q)} + \dfrac{\sum_{j=1}^{4} w_j \bar{z}_j}{\sum_{j=1}^{4} w_j}$, where $\bar{z}_i = B_i - G_i$,
yielding the enlarged Bayer CFA.

Camera zooming combined with demosaicking
[Figure: original images, conventional (demosaicked-domain) zooming, CFA zooming.]
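
And a corresponding sketch of the R/B step (again illustrative, with hypothetical neighborhood values), which adds an edge-weighted average of color differences to a spatially shifted anchor sample:

```python
import numpy as np

def chroma_update(anchor, r_or_b, g, weights):
    """R/B interpolation step for CFA zooming: add an edge-weighted average of
    the color differences (R_i - G_i or B_i - G_i) to a shifted anchor sample."""
    w = np.asarray(weights, dtype=float)
    diff = np.asarray(r_or_b, dtype=float) - np.asarray(g, dtype=float)
    return float(anchor + np.dot(w, diff) / w.sum())

# Hypothetical neighborhood: R samples, co-located G estimates, edge-sensing weights
print(chroma_update(anchor=120, r_or_b=[130, 128, 126, 132],
                    g=[110, 111, 109, 112], weights=[0.9, 0.8, 0.85, 0.2]))
```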

Video-demosaicking
Essential in single-sensor VIDEO cameras:
- motion video or image sequences represent a 3-D signal, i.e. a time sequence of 2-D images (frames)
- motion video usually exhibits significant correlation in both the spatial and the temporal sense
- by omitting the essential temporal characteristics, spatial processing methods, which process the individual frames separately, produce an output sequence with motion artifacts
[Figure: temporal, spatial, and spatiotemporal processing windows in (p, q, t) coordinates.]

Spatiotemporal video-demosaicking
Fast video-demosaicking procedure:
- usage in PDA and mobile phone imaging applications
- utilization of multistage unidirectional spatiotemporal filtering concepts
- essential spectral quantities are formed over the spatiotemporal neighborhood
- the structural content is followed by spatiotemporal edge-sensing weights
- the color component to be outputted is obtained via weighted-average operations defined over unidirectional demosaicked values

Video-demosaicking
[Figures (two slides): original frames, frames restored using spatial BI demosaicking, and frames restored using fast spatiotemporal demosaicking.]

Camera image indexing
Digital rights management in digital cameras:
- captured images are indexed directly in the single-sensor digital camera, mobile phone, or pocket device
- indexing is performed by embedding metadata information
- of great importance to end-users, database software programmers, and consumer electronics manufacturers
Registration: CFA data + metadata (capturing device information, satellite tracking information, semantic information, ...) -> indexed CFA data.

Camera image indexing
Embedding procedure: the metadata are encrypted and embedded into the R and B CFA components of the CFA data, producing an indexed CFA image.
Extraction procedure: from the indexed CFA data (or the indexed demosaicked image), the R and B CFA components are extracted, the embedded data are recovered from the R and B CFA samples, and the metadata are obtained by decryption.

Where to learn more?
R. Lukac, B. Smolka, K. Martin, K.N. Plataniotis, and A.N. Venetsanopoulos, "Vector Filtering for Color Imaging," IEEE Signal Processing Magazine, vol. 22, no. 1, pp. 74-86, January 2005.
R. Lukac and K.N. Plataniotis, "Fast Video Demosaicking Solution for Mobile Phone Imaging Applications," IEEE Transactions on Consumer Electronics, vol. 51, no. 2, pp. 675-681, May 2005.
R. Lukac and K.N. Plataniotis, "Data-Adaptive Filters for Demosaicking: A Framework," IEEE Transactions on Consumer Electronics, vol. 51, no. 2, pp. 560-570, May 2005.
R. Lukac, K. Martin, and K.N. Plataniotis, "Demosaicked Image Postprocessing Using Local Color Ratios," IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 6, pp. 914-920, June 2004.

R. Lukac and K.N. Plataniotis, "Normalized Color-Ratio Modelling for CFA Interpolation," IEEE Transactions on Consumer Electronics, vol. 50, no. 2, pp. 737-745, May 2004.
R. Lukac, K.N. Plataniotis, and D. Hatzinakos, "Color Image Zooming on the Bayer Pattern," IEEE Transactions on Circuits and Systems for Video Technology, vol. 15, 2005, to appear.
R. Lukac and K.N. Plataniotis (Eds.), "Color Image Processing: Emerging Applications," CRC Press, spring 2006.
www.dsp.utoronto.ca/~lukacr/index.php?page=research3