ORIGINAL ARTICLE
A COMPARATIVE STUDY OF QUALITY ANALYSIS ON VARIOUS IMAGE FORMATS
1 M.S.L. RATNAVATHI, 1 SYEDSHAMEEM, 2 P. KALEE PRASAD, 1 D. VENKATARATNAM
1 Department of ECE, K L University, Guntur
2 Doordarshan Kendra, Vijayawada

ABSTRACT
In the present scenario, with the advent and usage of electronic gadgets like iPods, DVDs and VCDs, it is quite necessary to assess the quality of a picture. The original video/image is translated or transformed into several formats according to the viewer's or user's choice. Correspondingly, the video/image loses its originality and may sometimes even get completely destroyed. The current state of the art requires many compromises, for example temporal resolution versus noise, spatial resolution versus image size, and luminance/color range versus gamut. These choices affect the quality of the reproduced images. To make optimal choices, it is necessary to know how particular choices affect the impression of the viewer. The present paper gives a brief insight into the comparative study of image quality.
Keywords: Subjective quality analysis, Objective quality analysis, PSNR, SSIM.

1. INTRODUCTION
Image quality is a characteristic of an image that measures the perceived image degradation (typically compared to an ideal or perfect image). Imaging systems may introduce some amount of distortion or artifacts into the signal, so quality assessment is an important problem. Image quality assessment is required because digital images are subject to a wide variety of distortions during acquisition, processing, transmission, compression, storage and reproduction, any of which may result in degradation of the visual quality of an image [1].

2. QUALITY ANALYSIS
Image quality is a characteristic of an image that measures the perceived image degradation [6]. It plays an important role in various image processing applications. The goal of image quality assessment is to supply quality metrics that can predict perceived image quality automatically. Image quality analysis comprises subjective and objective quality analysis.

2.1 Subjective Quality Analysis
The best way to judge the quality of an image is to look at it, because human eyes are the ultimate viewer. Subjective image quality is concerned with how an image is perceived by a viewer, who gives his or her opinion on a particular image [4]. The main subjective quality methods are Degradation Category Rating (DCR), Pair Comparison (PC) and Absolute Category Rating (ACR). The human subjects are shown two
sequences (original and processed) and are asked to assess the overall quality of the processed sequence with respect to the original (reference) sequence. The test is divided into multiple sessions, and each session should not last more than 30 minutes. In every session, several dummy sequences are added; these are used to train the human subjects and are not included in the final score. The subjects score the processed image sequence on a scale (from 0 to 5 or 9) corresponding to their mental measure of the quality. This is termed the Mean Observer Score (MOS) [9].

Table 1: Mean Observer Score

MOS   Quality     Impairment
5     Excellent   Imperceptible
4     Good        Perceptible but not annoying
3     Fair        Slightly annoying
2     Poor        Annoying
1     Bad         Very annoying

Four serious drawbacks of this approach are:
- The setup is difficult: rooms need to be secure, displays calibrated, etc.
- Human subjects must be selected, screened and paid.
- It is difficult to assess why one image was selected over another, due to the subjective nature of the test.
- Even though a wide variety of possible methods and test parameters can be considered, only a small fraction of the possible design decisions can be investigated, due to the time-consuming procedure.

2.2 Objective Quality Analysis
Objective metrics build models that describe the influence of several physical image characteristics on image quality, usually through a set of image attributes thought to determine image quality. When the influence of a set of design choices on physical image characteristics is known, these models can predict image quality. The models express video quality in terms of visible distortions, or artifacts, introduced during the design process [4]. Three methods exist for objective image quality testing:
1. Full Reference: testing when the original and processed images are both present.
2. No Reference: testing when only the processed image is present.
3. Reduced Reference: testing when information about the original and processed image is present, but not the actual video sequences.
In general, objective metrics try to mimic the human visual system and return a score that tracks the MOS scores. A few of the objective metrics are listed below:
PSNR - Peak Signal to Noise Ratio
JND - Just Noticeable Differences
SSIM - Structural SIMilarity
VQM - Video Quality Metric
Two serious drawbacks of this approach are:
- These algorithms measure visible differences, not image quality: if the processed image sequence is shifted up or down, the metric will show a difference even though the image quality is the same.
- The best algorithms correlate with human subjects only about 75% of the time.
The advantages of this approach are:
- It is repeatable, and a quantitative score can be generated.
- The price of the image quality testing tool is the major cost.
- Small differences are detected anywhere within the image.

2.2.1 Peak Signal to Noise Ratio
Peak signal to noise ratio (PSNR) is the ratio between the maximum possible power of a signal and the power of the corrupting noise that affects the fidelity of its representation. Because many signals have a very wide dynamic range, PSNR is usually expressed on the logarithmic decibel scale. PSNR is most commonly used as a measure of the quality of reconstruction of lossy compression codecs (e.g., for image compression). The signal in this case is the original data, and the noise is the error introduced by compression. When comparing compression codecs, PSNR is used as an approximation to human perception of reconstruction quality; although a higher PSNR would normally indicate that the reconstruction is of higher quality, in some cases one reconstruction may appear closer to the original than another even though it has a lower PSNR. One has to be extremely careful with the range of validity of this metric; it is only conclusively valid when used to compare results from the same codec (or codec type) and the same content.
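The PSNR computation, defined formally in the equations that follow, can be sketched in a few lines. The snippet below is a minimal, hypothetical NumPy example (not the authors' code): it computes the MSE between a reference image and a distorted copy and converts it to PSNR in decibels.

```python
import numpy as np

def mse(reference, distorted):
    """Mean squared error between two equally sized grayscale images."""
    ref = reference.astype(np.float64)
    dist = distorted.astype(np.float64)
    return np.mean((ref - dist) ** 2)

def psnr(reference, distorted, max_val=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    err = mse(reference, distorted)
    if err == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / err)

# Hypothetical example: a synthetic 8-bit image and a noisy copy of it.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(original.astype(np.int16)
                + rng.normal(0, 10, size=original.shape).astype(np.int16),
                0, 255).astype(np.uint8)
print(f"PSNR: {psnr(original, noisy):.2f} dB")
```

For identical inputs the MSE is zero and the sketch reports an infinite PSNR, consistent with the remark at the end of this section.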
PSNR is most easily defined via the mean squared error (MSE), which for two m x n monochrome images I and K, where one of the images is considered a noisy approximation of the other, is defined as:

MSE = \frac{1}{mn} \sum_{i=0}^{m-1} \sum_{j=0}^{n-1} \left[ I(i,j) - K(i,j) \right]^2    (1)

The PSNR is then defined as:

PSNR = 10 \log_{10} \left( \frac{MAX_I^2}{MSE} \right) = 20 \log_{10} \left( \frac{MAX_I}{\sqrt{MSE}} \right)    (2)

Here, MAX_I is the maximum possible pixel value of the image. When the pixels are represented using 8 bits per sample, this is 255. More generally, when samples are represented using linear PCM with B bits per sample, MAX_I is 2^B - 1. For color images with three RGB values per pixel, the definition of PSNR is the same except that the MSE is the sum over all squared value differences divided by the image size and by three. Typical values for the PSNR in lossy image and video compression are between 30 and 50 dB, where higher is better. Acceptable values for quality loss in wireless transmission are considered to be about 20 dB to 25 dB. When the two images are identical, the MSE equals zero, resulting in an infinite PSNR [10].

2.2.2 Structural Similarity
The structural similarity (SSIM) index is a method for measuring the similarity between two images. The SSIM index is a full-reference metric; in other words, it measures image quality based on an initial uncompressed or distortion-free image as reference. SSIM is designed to improve on traditional methods like peak signal-to-noise ratio (PSNR) and mean square error
(MSE), which have proved to be inconsistent with human eye perception. The difference with respect to the techniques mentioned previously, such as MSE or PSNR, is that those approaches estimate perceived errors, whereas SSIM considers image degradation as a perceived change in structural information [8]. Structural information is the idea that pixels have strong inter-dependencies, especially when they are spatially close. These dependencies carry important information about the structure of the objects in the visual scene. The SSIM metric is calculated on various windows of an image. The measure between two windows x and y of common size N x N is:

SSIM(x, y) = \frac{(2\mu_x \mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}    (3)

In this equation \mu_x, \mu_y, \sigma_x^2, \sigma_y^2 and \sigma_{xy} are the estimates of the mean of x, the mean of y, the variance of x, the variance of y and the covariance of x and y. The terms c_1 = (k_1 L)^2 and c_2 = (k_2 L)^2 are constants, where L is the dynamic range of the pixel values, and k_1 = 0.01 and k_2 = 0.03 by default [1]. The value of SSIM lies between -1 and 1 and takes the best value of 1 if x_i = y_i for all values of i. The quality index is applied to every image using a sliding window with an 11 x 11 circular-symmetric Gaussian weighting function, for which the quality index is calculated; the total index of the image is the average of all the quality indexes of the image.

3. RESULTS
Figure (1) below shows the original image and its distorted versions, i.e., the image added with salt and pepper noise of density 0.015, Gaussian noise with mean 0 and variance 0.01 and
speckle noise with variance 0.015, in JPEG format. The mean square error and the peak signal to noise ratio for the noisy versions are calculated and tabulated for different image formats in Table 2.

Fig (1): Original image; image with salt & pepper noise; Gaussian noise; speckle noise

Table 2: MSE and PSNR values (PSNR in dB) for different image formats and different noises

Image    Salt & Pepper noise    Speckle noise          Gaussian noise
Format   PSNR      MSE          PSNR      MSE          PSNR       MSE
JPEG     22.4029   376.87       27.9167   105.88       20.96900   524.30
TIFF     22.4335   374.22       27.9107   106.02       20.96599   524.67
PNG      22.4455   373.19       27.9208   105.78       20.9558    525.89
BMP      22.3719   379.57       27.9202   105.79       20.9597    525.42
GIF      22.8971   336.33       24.7343   220.32       20.67422   561.12

Similarly, the structural similarity index is also calculated for the different image formats and noises and is tabulated in Table 3.
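An experiment of this kind can be reproduced in outline with a short script. The sketch below is hypothetical (not the authors' code): it uses a synthetic random image rather than the test image of Figure (1), and for brevity evaluates Eq. (3) once over the whole image rather than with the paper's 11 x 11 circular-symmetric Gaussian sliding window, so its values are only indicative.

```python
import numpy as np

def psnr(ref, dist, max_val=255.0):
    """PSNR in dB between a reference image and a distorted copy."""
    err = np.mean((ref.astype(float) - dist.astype(float)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / err)

def global_ssim(x, y, L=255.0, k1=0.01, k2=0.03):
    """Eq. (3) evaluated once globally (a simplification of sliding-window SSIM)."""
    x = x.astype(float); y = y.astype(float)
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(128, 128)).astype(float)

# Salt & pepper noise, density 0.015 as in the experiment (half 0, half 255).
sp = img.copy()
mask = rng.random(img.shape)
sp[mask < 0.0075] = 0
sp[mask > 1 - 0.0075] = 255

# Gaussian noise, mean 0, variance 0.01 on a [0, 1] scale (sigma = 25.5 on [0, 255]).
gauss = np.clip(img + rng.normal(0, 0.1 * 255, img.shape), 0, 255)

for name, noisy in [("salt & pepper", sp), ("gaussian", gauss)]:
    print(f"{name}: PSNR = {psnr(img, noisy):.2f} dB, "
          f"SSIM = {global_ssim(img, noisy):.4f}")
```

As in Tables 2 and 3, the two noise types yield clearly different PSNR and SSIM values even at comparable noise strengths.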
Table 3: Structural similarity index values for different image formats and different noises

Image Format   Salt & Pepper noise   Speckle noise   Gaussian noise
JPEG           0.6000                0.8873          0.4610
TIFF           0.6010                0.8866          0.4599
PNG            0.5976                0.8861          0.4627
BMP            0.6004                0.8858          0.4605
GIF            0.7465                0.8302          0.6360

4. CONCLUSION
The paper thus presents a perceptive approach to image quality analysis. It gives a clear description of image quality and its analysis. Subjective and objective quality analysis are described in detail, along with objective metrics such as PSNR (Peak Signal to Noise Ratio) and SSIM (Structural SIMilarity). In figure (1), the original image is altered with different distortions; for each distortion, the PSNR values across image formats (shown in Table 2) are nearly identical, yet the corresponding SSIM values in Table 3 differ noticeably, and the images can be seen to have different perceptual quality. Hence the quality prediction performance of a recently developed quality measure such as SSIM is quite competitive relative to the traditional quality measures.

5. REFERENCES
[1] Z. Wang and A. C. Bovik, "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Processing, vol. 13, pp. 600-612, Apr. 2004.
[2] Z. Wang and A. C. Bovik, Modern Image Quality Assessment, Morgan & Claypool Publishers, Jan. 2006.
[3] H. R. Sheikh and A. C. Bovik, "Image information and visual quality," IEEE Trans. Image Processing, vol. 15, pp. 430-444, Feb. 2006.
[4] A. Stoica, C. Vertan, and C. Fernandez-Maloigne, "Objective and subjective color image quality evaluation for JPEG 2000-compressed images," International Symposium on Signals, Circuits and Systems, vol. 1, pp. 137-140, July 2003.
[5] A. M. Eskicioglu and P. S. Fisher, "Image quality measures and their performance," IEEE Trans. Communications, vol. 43, pp. 2959-2965, Dec. 1995.
[6] http://www.cns.nyu.edu/~zwang/files/research/quality_index/demo_lena.html
[7] http://live.ece.utexas.edu/research/quality/vif.htm
[8] http://www.ece.uwaterloo.ca/~z70wang/research/ssim/
[9] http://en.wikipedia.org/wiki/mean_opinion_score
[10] http://en.wikipedia.org/wiki/objective_video_quality