Detection of Hue Modification Using Photo Response Non-Uniformity


Jong-Uk Hou, Student Member, IEEE, and Heung-Kyu Lee

Abstract: Hue modification is a common strategy used to distort the true meaning of a digital image. To detect this kind of image forgery, we propose a robust forensic scheme for detecting hue modification. First, we point out that the photo response non-uniformity (PRNU) separated by the color filter array forms a pattern independent of the other color channels, since the positions of the PRNU pixels of the different channels do not overlap. Using the PRNUs of each color channel of an image, we design a forensic scheme for estimating hue modification. We also propose an efficient estimation scheme and an algorithm for detecting partial manipulation. The results confirm that the proposed method distinguishes hue modification and estimates the degree of change; moreover, it is resistant to the effects of common image processing.

Index Terms: Digital image forensics, photo response non-uniformity (PRNU), sensor pattern noise, hue, color filter array.

I. INTRODUCTION

Since the advent of high-quality, low-cost, easily accessible image-editing tools, digital images can be modified not only by highly trained professionals but also by most average digital camera users. One of the common strategies image pirates apply to digital images is hue modification. Hue is a main property of a color; therefore, counterfeiters who attempt to tamper with a color attribute most commonly tamper with the hue. With an image-editing tool, a person can severely distort the actual meaning of an image by modifying its hue. In second-hand markets such as eBay and Amazon.com, counterfeiters can take unfair profits by changing the color of their merchandise. In addition, with severe hue modification, the media may broadcast distorted versions of a particular accident by changing the hue of the images shot at the accident site. For example, the German-language daily tabloid Blick forged an image by changing the color of flooding water to blood red so that it appeared to be blood, and then distributed the falsified image to news channels.

To cope with image forgeries, a number of forensic schemes have been proposed recently. Most schemes are based on detecting local inconsistencies such as resampling artifacts [1], color filter array interpolation artifacts [2], JPEG compression [3], or lighting conditions [4]. The pixel photo response non-uniformity (PRNU) is also widely used for detecting digital image forgeries [5]–[9].

Manuscript received November 9, 2015; revised January 5, 2016; accepted January 3, 2016. Date of publication xx xx, 2016; date of current version February 4, 2016. This research project was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Copyright Commission in 2015. The work of Jong-Uk Hou was supported by a Global Ph.D. Fellowship Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (5HAA375). (Corresponding author: H.-K. Lee.) The authors are with the School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea (e-mail: juheo@mmc.kaist.ac.kr; hklee@mmc.kaist.ac.kr). Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
There are also some methods for detecting the identical regions caused by copy-move forgery [10]–[13]. However, only two of these methods [2], [14] are able to detect hue forgery, because hue modification does not change other aspects of an image such as edges, shapes, gradations, and the PRNU.

Choi et al. [2] first proposed an algorithm for estimating hue modification using the neighboring correlation [15] induced by the color filter array (CFA) of a digital camera. They proposed a simple measure of changes in the CFA pattern based on counting the number of pixels whose neighbors satisfy an interpolation condition. However, this algorithm loses its accuracy after common image processing (e.g., resizing or JPEG compression) that breaks down the demosaicing traces of the original image. Moreover, Choi's algorithm only works with image data processed using a CFA Bayer pattern [16] whose configuration is already known.

This paper builds on our previous work [14], which described a naive forensic scheme for estimating the degree of hue modification based on the PRNU. The advances of this paper are as follows. First, we propose an efficient estimation scheme that skips unnecessary search intervals based on our model of the hue modification. Second, we propose a local forgery detector in which two threshold values are adopted to determine the forged regions. Finally, we model the distribution of the test results, analyze the error ratio, and propose an equation with two thresholds to reduce the false-positive ratio.

The rest of this paper is organized as follows. In Section II, we explain the details of the proposed method. We then test our method with various image data sets in Section III, and in Section IV we present our conclusions.

II. PROPOSED METHOD

In this section, we describe the separated PRNU created by the CFA, which forms a pattern independent of the other color channels. Based on our previous idea [14], we propose an enhanced algorithm for estimating hue modification using the separated PRNU, as well as an algorithm for detecting partial manipulation in which two threshold values are adopted to reduce the false-positive ratio. We now describe this process in detail.

A. Reference patterns for each color channel

The raw output of the image sensor is separated into three color components by a color filter array [17]. The PRNU, the unique fingerprint of a digital camera, is also separated by this process. Fig. 1 shows an example of PRNU separation by the Bayer pattern [16], the most widely used pattern in digital cameras. Each separated PRNU forms a pattern independent of the others, because the positions of the PRNU pixels do not overlap. For example, as shown in Fig. 1, a noise residual at a red filter position is included in the red channel, but not in the blue or green channels. Therefore, the PRNU of each color channel represents a unique characteristic of that untampered color channel.

To extract the PRNU, Lukas et al. [18] proposed a scheme that obtains an approximation of the PRNU using a wavelet-based denoising filter. Using this method, we obtain the PRNU of each color channel, P_c, by averaging the noise residuals of multiple untampered images I_c^(k), where k = 1, ..., N_p and c ∈ {r, g, b}, with r, g, and b denoting the red, green, and blue color channels, respectively. The obtained PRNU P_c of each color channel c is used as a color-reference pattern, which represents a unique characteristic of that untampered color channel. We note that, similar to [19], our method does not require a priori knowledge of the CFA color configuration. As with other PRNU-based methods [5]–[8], the reference patterns must be generated from the same camera that was used to take the suspicious image.

B. Proposed estimation algorithm

The naive estimation process described in [14] is based on an exhaustive search, which is very time consuming. We therefore propose an enhanced scheme for estimating the hue modification efficiently, without checking every possible case. Below, we explain the proposed algorithm in detail.

1) Modeling the hue modification effect on the PRNU: Each RGB pixel of an image can be represented as a vector in three-dimensional Cartesian coordinates. In this representation, a hue modification can be defined as a rotation of the RGB pixel vector that preserves its magnitude. Given a unit vector u = (u_r, u_g, u_b) with u_r = u_g = u_b = 1/√3, the matrix R(θ) for a rotation by an angle θ about the axis in the direction of u is given by

$$R(\theta) = \begin{bmatrix} \alpha+\beta & \beta-\gamma & \beta+\gamma \\ \beta+\gamma & \alpha+\beta & \beta-\gamma \\ \beta-\gamma & \beta+\gamma & \alpha+\beta \end{bmatrix}, \qquad (1)$$

where α = cos θ, β = (1/3)(1 − cos θ), and γ = (1/√3) sin θ [20]. Multiplying this matrix by an original pixel vector v = (r, g, b)^T yields the hue-modified pixel vector v' = (r', g', b')^T.

To analyze the effect of hue modification on the separated PRNU, we consider the modification in terms of hue rotations in RGB color space. First, we define three pixel vectors v_r = [p_r, p̃_g, p̃_b]^T, v_g = [p̃_r, p_g, p̃_b]^T, and v_b = [p̃_r, p̃_g, p_b]^T in RGB color space, which represent pixels from the red, green, and blue positions of the CFA, respectively. Here p_c is the pixel value measured directly from the image sensor, and p̃_c is a pixel value generated by the demosaicing process. To retain only the meaningful PRNU components, we remove the interpolated values p̃_c, which carry propagation error from demosaicing [19], and obtain the normalized PRNU noise pixels η_r = [1, 0, 0]^T, η_g = [0, 1, 0]^T, and η_b = [0, 0, 1]^T.
Then we apply the rotation matrix R(θ) to each of the pixel vectors η_r, η_g, and η_b to simulate hue modification, obtaining the hue-modified noise pixel vectors

$$\tilde{\eta}_r(\theta) = \begin{bmatrix} \alpha+\beta \\ \beta+\gamma \\ \beta-\gamma \end{bmatrix}, \quad \tilde{\eta}_g(\theta) = \begin{bmatrix} \beta-\gamma \\ \alpha+\beta \\ \beta+\gamma \end{bmatrix}, \quad \tilde{\eta}_b(\theta) = \begin{bmatrix} \beta+\gamma \\ \beta-\gamma \\ \alpha+\beta \end{bmatrix}, \qquad (2)$$

where θ is the degree of hue modification. Using these pixel vectors, we generate a 1×3 synthetic noise image I'_c = [η_r, η_g, η_b] and a hue-modified synthetic noise image I'_c(θ) = [η̃_r(θ), η̃_g(θ), η̃_b(θ)]. We then calculate the correlation ρ'_c(θ) as

$$\rho'_c(\theta) = \mathrm{corr}\big(I'_c,\; I'_c(\theta)\big). \qquad (3)$$

Fig. 2 shows the correlation values calculated over all hue rotations from a real data set (N_p = 3, Nikon D9) and from the proposed model. We observe a structural similarity between the model curve Σ_c ρ'_c(θ) and the curve Σ_c ρ_c(θ) measured from the real data set.

Fig. 1. PRNU separation by the CFA Bayer pattern.

Fig. 2. Calculated correlation values for all hue rotations: (a) measured from the proposed model (Σ_c ρ'_c(θ)); (b) measured from real images (Σ_c ρ_c(θ)). Both graphs have a similar shape.
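The correlation model of Eqs. (1)–(3) can be reproduced numerically. The sketch below is ours, not the authors' code; it assumes NumPy, uses a plain Pearson correlation for corr, and all helper names are hypothetical.

```python
import numpy as np

def rotation_about_gray_axis(theta_deg):
    """R(theta) of Eq. (1): rotation about the gray axis u = (1,1,1)/sqrt(3)."""
    t = np.deg2rad(theta_deg)
    a = np.cos(t)                    # alpha
    b = (1.0 - np.cos(t)) / 3.0      # beta
    g = np.sin(t) / np.sqrt(3.0)     # gamma
    return np.array([[a + b, b - g, b + g],
                     [b + g, a + b, b - g],
                     [b - g, b + g, a + b]])

def corr(x, y):
    """Pearson correlation of two flattened arrays."""
    x = x.ravel() - x.mean()
    y = y.ravel() - y.mean()
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

def model_curve(thetas):
    """Sum over channels of rho'_c(theta), Eq. (3), for the 1x3 synthetic
    noise image built from eta_r, eta_g, eta_b (the identity columns)."""
    eta = np.eye(3)                  # pixel j is eta_j; channel c is row c
    values = []
    for theta in thetas:
        modified = rotation_about_gray_axis(theta) @ eta
        values.append(sum(corr(eta[c], modified[c]) for c in range(3)))
    return np.array(values)

if __name__ == "__main__":
    thetas = np.arange(-180, 181)
    curve = model_curve(thetas)
    print("model curve peaks at theta =", int(thetas[curve.argmax()]), "degrees")
```

The resulting curve is bell-shaped and symmetric about θ = 0, which is the property exploited by the search algorithm in the next subsection.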

2) Design of the hue modification estimation algorithm: From the preceding analysis, we observe that the correlation graphs measured from the real data set and those produced by the proposed model are bell-shaped and symmetric with respect to the maximum point, and that the slopes around the maximum converge towards it. Based on these features, we designed an algorithm that estimates the hue modification without checking every hue degree. The proposed algorithm consists of two steps.

Step 1. The first step reduces computation time by removing unnecessary search intervals on the basis of a small number of samples. It is similar to the naive scheme but much faster, since the search range is divided at a regular interval i. First, we select n_s = 360/i samples at the regular interval i in the hue range (0, 360], creating a set of selected hue values Θ = {θ_1, θ_2, ..., θ_{n_s}}. The hue of the suspicious image I_sus is then modified by each value in Θ, giving the hue-modified image set I^Θ_sus = {I_sus(θ_1), I_sus(θ_2), ..., I_sus(θ_{n_s})}. Applying the denoising filter F to I^Θ_sus, we obtain the set N^Θ_sus = {n_c(θ_1), n_c(θ_2), ..., n_c(θ_{n_s})} of noise residuals. We then calculate the cross-correlation between the noise residuals in N^Θ_sus and the reference patterns P_c of the source camera, and choose the value θ* with the largest correlation as an intermediate candidate for the modified hue degree:

$$\theta^{*} = \operatorname*{argmax}_{\theta \in \Theta} \sum_{c} \mathrm{corr}\big(n_c(\theta),\, P_c\big). \qquad (4)$$

Step 2. In the second step, we set the candidate range [Min_d, Max_d] with Min_d = θ* − i/2 and Max_d = θ* + i/2. If the global maximum θ̂ lies in [Min_d, Max_d], we can find it with a hill-climbing search, since the slopes around the maximum point converge towards it.

Algorithm 1
    Min_d ← θ* − i/2
    Max_d ← θ* + i/2
    itv ← i
    ε ← smallest unit of hue modification
    while itv > ε do
        LeftEstimation  ← Σ_c corr(n_c(Min_d + itv/4), P_c)
        RightEstimation ← Σ_c corr(n_c(Max_d − itv/4), P_c)
        if LeftEstimation > RightEstimation then
            Max_d ← Max_d − itv/2
        else
            Min_d ← Min_d + itv/2
        end if
        itv ← itv/2
    end while
    return θ̂ ← mod(360 − (Max_d + Min_d)/2, 360)

The second step is given in Algorithm 1. Its design follows the concept of a binary search combined with hill-climbing optimization. First, we calculate the correlation values at the intermediate angles to the left and right of the candidate. Second, we narrow the search range toward the direction of the greater value and halve the interval. These iterations are repeated until the interval becomes smaller than ε. The estimated degree θ̂ is then calculated from the final candidate range [Min_d, Max_d]:

$$\hat{\theta} = \mathrm{mod}\!\left(360 - \frac{Max_d + Min_d}{2},\; 360\right). \qquad (5)$$

The estimated degree θ̂ indicates the degree of the hue modification. For example, if θ̂ is zero, the suspicious image I_sus did not undergo any hue manipulation; if θ̂ is not zero, the hue of I_sus was modified.

C. Partial-manipulation detection algorithm

In this section, we describe how partial manipulation of the hue is detected using the proposed estimator. To detect hue-modified areas, we define a w×w-pixel sliding window that moves across the reference pattern P and the suspicious image I_sus. Using this window, we obtain the block I_sus(i, j) to be investigated and its corresponding reference pattern P(i, j):

$$P(i,j) = P[n][m], \qquad (6)$$

$$I_{sus}(i,j) = I_{sus}[n][m], \qquad (7)$$

where n and m range over the w×w window anchored at (i, j), and [n][m] denotes the image pixel in the n-th row and m-th column.
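The estimator F(P, I) applied to each window below is the two-step search just described. As a minimal sketch (ours, not the authors' implementation), assuming hypothetical helpers hue_rotate(img, degrees) for the hue rotation and extract_noise(img) for the wavelet-based noise residual, it could be written as:

```python
import numpy as np

def estimate_hue_modification(img, refs, hue_rotate, extract_noise,
                              interval=10.0, eps=0.5):
    """Two-step hue estimator: coarse scan (Step 1) + Algorithm 1 refinement.

    img  : H x W x 3 suspicious image (or block)
    refs : H x W x 3 per-channel reference patterns P_c
    """
    def corr(x, y):
        x = x.ravel() - x.mean()
        y = y.ravel() - y.mean()
        return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

    def score(theta):
        # Eq. (4): sum of per-channel correlations with the reference patterns.
        noise = extract_noise(hue_rotate(img, theta))
        return sum(corr(noise[..., c], refs[..., c]) for c in range(3))

    # Step 1: evaluate n_s = 360 / interval coarse candidates.
    candidates = np.arange(interval, 360.0 + 1e-9, interval)
    theta_star = max(candidates, key=score)

    # Step 2 (Algorithm 1): shrink the interval toward the larger score.
    lo, hi, itv = theta_star - interval / 2, theta_star + interval / 2, interval
    while itv > eps:
        if score(lo + itv / 4) > score(hi - itv / 4):
            hi -= itv / 2
        else:
            lo += itv / 2
        itv /= 2

    # Eq. (5): map the best compensating rotation to the modification degree.
    return (360.0 - (lo + hi) / 2.0) % 360.0
```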
Using the proposed estimator F(P, I), we obtain an estimation map M_θ of I_sus:

$$M_\theta(i,j) = F\big(P(i,j),\, I_{sus}(i,j)\big). \qquad (8)$$

If the inspected image has been partially tampered, the estimated degree of the tampered region will not be zero. However, the estimation result M_θ(i, j) may contain some false positives, since regions in which the PRNU is naturally suppressed cause false positives [6]. Therefore, we need to check not only the hue degrees but also the correlation values to reduce false positives. First, we calculate the correlation map M_ρ(i, j) using M_θ(i, j):

$$M_\rho(i,j) = \sum_{c} \mathrm{corr}\big(\eta_c^{(i,j)}(M_\theta(i,j)),\; P_c^{(i,j)}\big), \qquad (9)$$

where c ∈ {r, g, b} and η_c^{(i,j)}(θ) is the noise-residual component of I_sus(i, j). If the pixel (i, j) does not have sufficient noise strength for detection, the estimation result M_θ(i, j) will not be accurate and should be regarded as a false positive. Therefore, we determine whether the pixel (i, j) has been modified using

$$Z(i,j) = \begin{cases} 1, & |\theta_0 - M_\theta(i,j)| \ge \tau_\theta \ \text{and}\ M_\rho(i,j) \ge \tau_\rho, \\ 0, & \text{otherwise,} \end{cases} \qquad (10)$$

where Z(i, j) indicates whether the hue of each pixel has been modified, and τ_θ and τ_ρ are the difference threshold for the hue-modification degree and the threshold for the correlation, respectively. The operator |·| denotes an absolute value in the

range (0, 360], and θ_0 indicates the unmodified degree value, 0. In Eq. (10), the value θ_0 could be replaced with an estimated hue value M_θ. Specific implementation details for this algorithm are described in Section III-C.

III. EXPERIMENTAL RESULTS

For the experiments, we used sample raw images collected from the Dresden Image Database for digital image forensics. The remaining raw images were taken directly with the camera models listed in Table I, which also gives the details of the image database. All raw images were demosaiced with dcraw, the most widely used raw-image decoder, using the adaptive homogeneity-directed (AHD) interpolation algorithm. For accuracy evaluation, the root mean square error (RMSE), $\sqrt{E\big((\hat{\theta}-\theta)^2\big)}$, was calculated, where θ and θ̂ represent the actual and estimated degree of hue modification, respectively. The RMSE aggregates the magnitudes of the estimation errors across the various experiments. Generally, RMSE values around 3 indicate estimations of high accuracy (>99%).

A. Hue modification estimation results

1) Reference pattern: To evaluate our method, we used various kinds of source images to generate the reference patterns. Table II lists the source-image description for each reference pattern. We used not only flat images (such as blue-sky images) but also general images taken in a wide variety of indoor and outdoor scenes, to evaluate our method in various scenarios. Fig. 3(a) plots the RMSE versus the number of images N_p used for each reference pattern. Experiments with RP1 showed the best results, even when N_p < 5. The results for RP2 were slightly lower than those of RP1, but still showed good performance. RP3 performed worst, since JPEG compression damages the high-frequency components of the target image (which include the PRNU component).

TABLE I
DIGITAL CAMERA MODELS IN OUR EXPERIMENTS AND ESTIMATION RESULTS FOR EACH CAMERA MODEL
  ID     Model        #images   Resolution   Mean     S.D.    RMSE
  D      Nikon D      394       66           -.549    .79     .858
  D7     Nikon D7     34        4            -.63     .54     .66
  D7s    Nikon D7s    34        4            -.35     3.33    3.643
  D9     Nikon D9     435       868          -.54     .77     .85
  E4     Olympus E4   37        8            -.58     .37     .43

TABLE II
SOURCE IMAGE DESCRIPTION FOR EACH TYPE OF REFERENCE PATTERN
  ID     Source image description          Camera
  RP1    Flat images                       D9
  RP2    General images                    D9
  RP3    General images with JPEG 95       D9

Fig. 3. (a) Estimated RMSE of the degree for each reference pattern versus the number of images N_p used for the reference pattern; (b) RMSE and computation time of the fast algorithm versus the sampling interval i.

2) Evaluation of the fast algorithm: To evaluate the fast algorithm described in Section II-B, the computation time and accuracy were measured at various sampling intervals i. Tests were conducted on test images taken with the Nikon D9, using RP1 (N_p = 3). A computer with an Intel i7-377 (3.4 GHz) CPU and 6 GB of main memory was used to measure performance. The hue of the sample images was shifted randomly to generate the test images. Fig. 3(b) shows the computation time and accuracy of the fast algorithm. Note that the result with i = 1 is the same as that of the naive scheme [14]. As i increases, the computation time decreases dramatically at the cost of some estimation accuracy. Even so, RMSE values around 3 are still sufficient to detect hue modification. To analyze the computational complexity, we define ε as the unit size of hue modification.
We can describe the size of the hue-changeable range as n = 360/ε. The time complexity of the naive scheme is then T·O(n), where T denotes the total processing cost of the noise extraction and correlation calculation for the suspicious image I_sus. In contrast, the time complexity of the fast algorithm is T·O(360/i + log i), which is much lower than that of the naive scheme.

3) Camera model: To examine the influence of the type of image sensor on the proposed method, we conducted tests with the camera models described in Table I. Table I also gives the details of the estimation results for each model: the estimated mean, standard deviation, and RMSE.

TABLE III
ESTIMATED MEAN, STANDARD DEVIATION, AND RMSE FOR EACH HUE INTERVAL
  Hue θ   0        60       120      180      240      300
  Mean    -.549    59.35    119.4    179.44   239.4    299.5
  S.D.    .79      .799     .795     .79      .793     .794
  RMSE    .833     .859     .863     .855     .859     .86

TABLE IV
RMSE FOR EACH IMAGE SIZE
  N_p     32×32    64×64    128×128   256×256   512×512   1024×1024
  3       66.84    53.485   36.83     7.646     7.537     4.3
  3       45.35    34.39    3.5       7.867     5.3       3.5

According to the results, the estimation accuracy was good for the D, D7, D9, and E4. The estimation result for the D7s was somewhat less good and its mean was slightly biased, but it still showed high accuracy (around 99%).

4) Hue degree: To investigate the influence of the modified hue degree, the RMSE was measured for each hue degree. For the estimation, sample images from the Nikon D9 and reference pattern RP2 (N_p = 3) were used. The hue of the sample images was shifted from 0 to 330 in steps of 30 to generate the test images. As can be seen in Table III, the mean of the estimated degree is similar to the degree of the actual modification. Furthermore, the standard deviation and RMSE do not depend on the degree of hue modification.

5) Block size: In this experiment, the center regions of the sample images and reference patterns were cropped to various block sizes. Using the cropped blocks as input images, we evaluated the performance of the proposed method. Table IV shows the RMSE for each block size. The RMSE values increase as the block size decreases, because small blocks contain less PRNU than bigger blocks. This result is related to the performance of the partial-manipulation detector, because the w×w block corresponds directly to the w×w-pixel sliding window. As the window size increases, the accuracy of the hue estimation also increases, but the spatial resolution for distinguishing the forged region becomes coarser.

B. Comparison results for various attacks

The estimation results of the proposed method were compared with those of Choi's method [2] with its interval factor s. The proposed method was performed with reference pattern type RP2 for two sizes of N_p.

1) Image resizing: Fig. 4(a) shows the RMSE values for different image scaling factors. The reference patterns were pre-processed to the same sizes as the suspicious images before applying the proposed method, and the input images for Choi's method were restored to the original size for a fair comparison. The results of both methods are qualitatively similar at a scaling ratio of 1.0. However, Choi's method did not work at any scaling ratio other than the original size, whereas the results of the proposed method remain acceptable for scaling ratios of 0.5 and higher.

Fig. 4. Estimated RMSE of the degree for (a) image resizing and (b) JPEG compression quality (*: uncompressed).

2) JPEG compression: We compressed the test images at a wide range of JPEG quality factors and tested the proposed method for estimating hue modification. Fig. 4(b) shows the RMSE values for the different JPEG compression qualities. The performance of the proposed method for images with a JPEG quality factor of 95 and above was as good as the result for uncompressed images.
The results for quality factors between 80 and 95 were also acceptable. On the other hand, even in tests with a high JPEG quality factor (e.g., 98 and 97), Choi's method [2] demonstrated relatively poor performance for estimating hue modification, and for quality factors less than 95 it did not work at all. There is an interesting behavior in Fig. 4(b): when the JPEG quality factor drops toward 30, the proposed method reverses its trend, increasing for one value of N_p and decreasing for the other, compared to the overall variation. At JPEG quality factors less than 50, the estimated degrees for some images (especially dark and highly textured ones) were completely random, since the PRNU components were extracted incompletely from those images. We therefore concluded that these outliers gave the results their odd form.
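The robustness experiments above follow a simple pattern: apply a known hue shift, apply the attack, re-estimate the shift, and accumulate the error. A sketch of such a harness (ours, not the authors' code; hue_rotate, jpeg_compress, and estimate_hue_modification are assumed helpers, the last as sketched in Section II-B) is shown below.

```python
import numpy as np

def rmse_under_attack(images, refs, estimate, hue_rotate, attack,
                      trials=50, seed=0):
    """Monte-Carlo RMSE of a hue estimator under a post-processing attack
    (e.g. JPEG compression or rescaling), in the spirit of Fig. 4."""
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(trials):
        img = images[rng.integers(len(images))]
        theta = float(rng.uniform(0.0, 360.0))    # ground-truth modification
        forged = attack(hue_rotate(img, theta))   # modify hue, then attack
        theta_hat = estimate(forged, refs)
        d = abs(theta_hat - theta) % 360.0        # wrap-around error on the
        errors.append(min(d, 360.0 - d))          # hue circle (our choice)
    return float(np.sqrt(np.mean(np.square(errors))))

# Hypothetical usage:
#   rmse = rmse_under_attack(
#       test_images, ref_patterns,
#       estimate=lambda im, r: estimate_hue_modification(im, r, hue_rotate,
#                                                        extract_noise),
#       hue_rotate=hue_rotate,
#       attack=lambda im: jpeg_compress(im, quality=90))
```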

Fig. 5. Sample images for the partial-manipulation detection experiment with their estimation maps M_θ(i, j) and detection maps Z(i, j). To show the estimated hue degree, degrees are mapped to the hue of an HSV image whose saturation is 1 and brightness is 0.5. In the detection map Z(i, j), forged regions are mapped to the estimated hue value and unmodified regions are represented as black pixels.

C. Detection of partial manipulation

In this section, we tested the algorithm for detecting partial manipulation. Using Adobe Photoshop CS6, we modified the hue of three sample images taken with the Nikon D9. As shown in Fig. 5, the hue of target objects, such as a smartphone case, was manipulated to distort the meaning of the images.

1) Reducing false-positive errors: To reduce the detection errors described in Section II-C, we modeled the distribution of our estimation results with the two-dimensional Gaussian model

$$f(x,y) = A \exp\!\left(-\left(\frac{(x-\mu_x)^2}{2\sigma_x^2} + \frac{(y-\mu_y)^2}{2\sigma_y^2}\right)\right), \qquad (11)$$

where the x-axis and y-axis indicate the estimated hue degree and the correlation value, respectively. Here, μ_x and σ_x denote the mean and standard deviation of x, and the coefficient A is the amplitude of the distribution. The distribution of correlation values follows the generalized Gaussian distribution, as discussed elsewhere [6]. We determined experimentally that the distribution of the estimated hue degrees also follows a Gaussian distribution. Fig. 6 shows the three-dimensional histogram of the bivariate data [M_θ, M_ρ] obtained from Images I.1 and I.2. To visualize the histogram, the range of the estimated hue was mapped to [−180, 180]. We intentionally selected a small N_p (= 5, RP1) to make the false positives visible as a bar in the histogram. A small number of false positives with M_θ(i, j) > τ_θ can be observed in the red-circled areas of Fig. 6(a); for reference, false-positive regions are also visible at the top of Fig. 5(o). Generally, these kinds of false positives occur because of inaccurately extracted PRNU, as discussed in Section II-C.

Fig. 6. Three-dimensional histogram of the bivariate data [M_θ, M_ρ]: (a) obtained from the original image (Image I.1); (b) obtained from a forged image (Image I.2). To visualize the histogram, the range of the estimated hue was changed to [−180, 180].
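As a rough illustration of this modeling step (ours, not from the paper; NumPy is assumed, and the maps M_θ and M_ρ are taken as given arrays obtained from an untampered calibration image), the parameters of the Gaussian in Eq. (11) can be estimated from the per-block data:

```python
import numpy as np

def fit_bivariate_gaussian(M_theta, M_rho):
    """Fit the axis-aligned 2-D Gaussian of Eq. (11) to per-block hue
    estimates (wrapped to [-180, 180]) and correlation values."""
    x = ((np.asarray(M_theta, dtype=float).ravel() + 180.0) % 360.0) - 180.0
    y = np.asarray(M_rho, dtype=float).ravel()
    mu_x, sigma_x = float(x.mean()), float(x.std(ddof=1))
    mu_y, sigma_y = float(y.mean()), float(y.std(ddof=1))
    A = 1.0 / (2.0 * np.pi * sigma_x * sigma_y)   # normalized amplitude
    return mu_x, sigma_x, mu_y, sigma_y, A

def gaussian_density(x, y, mu_x, sigma_x, mu_y, sigma_y, A):
    """Evaluate f(x, y) of Eq. (11) at the given points."""
    return A * np.exp(-(((x - mu_x) ** 2) / (2.0 * sigma_x ** 2)
                        + ((y - mu_y) ** 2) / (2.0 * sigma_y ** 2)))
```

The fitted moments are exactly the quantities from which the thresholds of Eq. (12) below are derived.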

Thus, we adopt a threshold τ_θ for the estimated hue and a correlation threshold τ_ρ, defined as

$$\tau_{\theta} = \mu_x \pm t_x \sigma_x, \qquad \tau_{\rho} = \mu_y - t_y \sigma_y, \qquad (12)$$

where t_x and t_y are predetermined values used to adjust the probability of false-positive error. The two hue thresholds τ_{θ1} and τ_{θ2} are illustrated as red dashed lines, and τ_ρ as a red line, in Fig. 6(b). We thereby obtain the reducible false-positive rate P with τ_θ and τ_ρ as

$$P = \int_{-\infty}^{\tau_{\rho}} \left( \int_{-\infty}^{\infty} f(x,y)\, dx - \int_{\tau_{\theta 1}}^{\tau_{\theta 2}} f(x,y)\, dx \right) dy. \qquad (13)$$

In this experiment, we used t_x, t_y = .4 to obtain P = 3. We note that an inaccurate PRNU causes false negatives in other methods [5]–[7], [9], as opposed to false positives in our method.

2) Partial-manipulation detection results: For these experiments, RP2 (N_p = 3) reference patterns were used. We selected the 256×256 sliding window because of its relatively good estimation performance. Fig. 5 shows the estimated hue angle M_θ(i, j) and the detection result Z(i, j) for the partially manipulated images. To show the estimated hue angle, the estimated degree is mapped to the hue of an HSV image whose saturation is 1 and brightness is 0.5. In the detection map Z(i, j), manipulated regions are mapped to the estimated hue value, and unmodified regions are represented as black pixels. We removed from Z all connected tampered regions containing fewer than 8×8 pixels. Using the estimated hue values in the detection map, we can restore the hue-modified image to an image with colors similar to the original. For example, using the estimated hue illustrated in Fig. 5(r), we can determine that the color of the car in Fig. 5(n) was blue.

The effectiveness of the partial-manipulation detection algorithm rests on the assumption that the image has undergone hue modification and that the PRNU noise has not been distorted by other forms of image manipulation. Theoretically, any invariant image feature could be used to detect the relevant image modifications; practically, this is infeasible because invariant features such as the PRNU noise may be distorted by other, unknown manipulations. Even if the type and order of the image modifications were known, the complexity of detection would increase dramatically. Therefore, this issue should be resolved in combination with other PRNU-based methods such as [6], [8]. The PRNU-distorted regions can be detected by other PRNU-based methods, since we already have a reference pattern for our detector; our method, in contrast, regards these regions as false positives by reference to the threshold τ_ρ and Eq. (13). To enhance the performance of the local manipulation detector, we could adopt the following approaches to improve the quality of the PRNU noise in our scheme. First, Li [23] proposed an approach for attenuating the influence of scene details on the PRNU. Li et al. [9] also studied the potential correlation between the quality of the PRNU noise and the vignetting effect.

Fig. 7. Receiver operating characteristic (ROC) curves of the proposed method, Choi's method [2], and Chen's method [6], compared under various attacks: (a) no compression; (b) JPEG 95; (c) scaling 0.9; (d) scaling 0.5 / JPEG 95.
Lin et al. [24] proposed a preprocessing approach for attenuating the influence of non-unique artifacts on the reference pattern in order to reduce the false identification rate. Considering these aspects of the PRNU noise, we could further improve our partial-manipulation detector.

3) Comparison results for various attacks: The results of the proposed method were compared with those of Choi et al. [2] (interval factor s, 3×3 block size) and Chen et al. [6] (window size 256×256). Reference pattern RP2 (N_p = 3) was used for the proposed method and for the method of Chen et al. [6]. For the method of Chen et al., the estimated reference pattern and the noise residual extracted from each color channel were combined into a grayscale image using the linear combination described in [6].

Fig. 7 reports the receiver operating characteristic (ROC) curves of the proposed, Choi's [2], and Chen's [6] methods. We used the test images shown in Fig. 5 with JPEG compression and scaling. Choi's method performed better than the proposed method on uncompressed images (Fig. 7(a)). However, it lost accuracy after JPEG compression, and it completely malfunctioned in every case with resized images, even when the scaling ratio was around 1.0. Chen's method did not detect hue modification in any case.

IV. CONCLUSIONS

In this paper, we discussed a family of photo response non-uniformity (PRNU)-based image manipulation detectors able to localize hue-modified regions and estimate the modified degree even after arbitrary common image processing. Using the separated PRNU of an image, we designed a forensic scheme for estimating the degree of hue modification. We also proposed an efficient estimation scheme and an algorithm for the detection of partial manipulation. This method achieves robust

hue-forgery detection that is resistant to the effects caused by common image processing, which was not achieved by previous forgery detectors [2], [6]. We plan to extend the method to the estimation of other types of image-property modification, such as white balancing and saturation.

REFERENCES

[1] A. Popescu and H. Farid, "Exposing digital forgeries by detecting traces of resampling," IEEE Transactions on Signal Processing, vol. 53, no. 2, pp. 758-767, Feb. 2005.
[2] C.-H. Choi, H.-Y. Lee, and H.-K. Lee, "Estimation of color modification in digital images by CFA pattern change," Forensic Science International, vol. 226, pp. 94-105, 2013.
[3] H. Farid, "Exposing digital forgeries from JPEG ghosts," IEEE Transactions on Information Forensics and Security, vol. 4, no. 1, pp. 154-160, Mar. 2009.
[4] M. Johnson and H. Farid, "Exposing digital forgeries in complex lighting environments," IEEE Transactions on Information Forensics and Security, vol. 2, no. 3, pp. 450-461, Sep. 2007.
[5] J. Lukáš, J. Fridrich, and M. Goljan, "Detecting digital image forgeries using sensor pattern noise," in Proc. SPIE, vol. 6072, Feb. 2006.
[6] M. Chen, J. Fridrich, M. Goljan, and J. Lukáš, "Determining image origin and integrity using sensor noise," IEEE Transactions on Information Forensics and Security, vol. 3, no. 1, pp. 74-90, Mar. 2008.
[7] G. Chierchia, G. Poggi, C. Sansone, and L. Verdoliva, "A Bayesian-MRF approach for PRNU-based image forgery detection," IEEE Transactions on Information Forensics and Security, vol. 9, no. 4, pp. 554-567, Apr. 2014.
[8] G. Chierchia, S. Parrilli, G. Poggi, C. Sansone, and L. Verdoliva, "On the influence of denoising in PRNU based forgery detection," in Proceedings of the 2nd ACM Workshop on Multimedia in Forensics, Security and Intelligence (MiFor), New York, NY, USA: ACM, 2010.
[9] C.-T. Li and R. Satta, "On the location-dependent quality of the sensor pattern noise and its implication in multimedia forensics," in Proc. 4th International Conference on Imaging for Crime Detection and Prevention (ICDP), Nov. 2011, pp. 1-6.
[10] X. Pan and S. Lyu, "Region duplication detection using image feature matching," IEEE Transactions on Information Forensics and Security, vol. 5, no. 4, pp. 857-867, 2010.
[11] V. Christlein, C. Riess, J. Jordan, C. Riess, and E. Angelopoulou, "An evaluation of popular copy-move forgery detection approaches," IEEE Transactions on Information Forensics and Security, vol. 7, no. 6, pp. 1841-1854, 2012.
[12] S.-J. Ryu, M. Kirchner, M.-J. Lee, and H.-K. Lee, "Rotation invariant localization of duplicated image regions based on Zernike moments," IEEE Transactions on Information Forensics and Security, vol. 8, no. 8, pp. 1355-1370, Aug. 2013.
[13] D. Cozzolino, G. Poggi, and L. Verdoliva, "Efficient dense-field copy-move forgery detection," IEEE Transactions on Information Forensics and Security, vol. 10, no. 11, pp. 2284-2297, 2015.
[14] J.-U. Hou, H.-U. Jang, and H.-K. Lee, "Hue modification estimation using sensor pattern noise," in Proc. IEEE International Conference on Image Processing (ICIP), Oct. 2014, pp. 5287-5291.
[15] C.-H. Choi, J.-H. Choi, and H.-K. Lee, "CFA pattern identification of digital cameras using intermediate value counting," in Proceedings of the 13th ACM Multimedia Workshop on Multimedia and Security (MM&Sec), New York, NY, USA: ACM, 2011.
[16] B. E. Bayer, "Color imaging array," U.S. Patent 3,971,065, 1976.
[17] J. Nakamura, Image Sensors and Signal Processing for Digital Still Cameras. Boca Raton, FL, USA: CRC Press, 2005.
[18] J. Lukáš, J. Fridrich, and M. Goljan, "Digital camera identification from sensor pattern noise," IEEE Transactions on Information Forensics and Security, vol. 1, no. 2, pp. 205-214, 2006.
[19] C.-T. Li and Y. Li, "Color-decoupled photo response non-uniformity for digital image forensics," IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 2, pp. 260-271, 2012.
[20] C. J. Taylor and D. J. Kriegman, "Minimization on the Lie group SO(3) and related manifolds," Tech. Rep., 1994.
[21] T. Gloe, A. Winkler, and K. Borowka, "Efficient estimation and large-scale evaluation of lateral chromatic aberration for digital image forensics," in Proc. SPIE Conference on Media Forensics and Security.
[22] C.-K. Lin, "Pixel grouping for color filter array demosaicing," http://sites.google.com/site/chklin/demosaic.
[23] C.-T. Li, "Source camera identification using enhanced sensor pattern noise," IEEE Transactions on Information Forensics and Security, vol. 5, no. 2, pp. 280-287, June 2010.
[24] X. Lin and C.-T. Li, "Preprocessing reference sensor pattern noise via spectrum equalization," IEEE Transactions on Information Forensics and Security, vol. 11, no. 1, pp. 126-140, 2016.