A POSTPROCESSING TECHNIQUE FOR COMPRESSION ARTIFACT REMOVAL IN IMAGES

Nirmal Kaur
Department of Computer Science, Punjabi University Campus, Maur (Bathinda), India
Corresponding e-mail: kaurnirmal88@gmail.com

Abstract- Image processing is used to adapt visual information to the limited bandwidth of transmission media and the limited capacity of storage devices. Digital images and video are mostly coded using the discrete cosine transform (DCT) and the discrete wavelet transform (DWT), which introduce visible distortions called artifacts. Because multimedia data volumes are huge, higher compression with fewer visible defects is required. The visual artifacts in highly compressed images can be reduced by post-processing techniques. The block DCT (BDCT) is adopted by widely used image/video compression standards such as JPEG, MPEG-1, MPEG-2, MPEG-4 and H.263 owing to its high energy compaction, low computational complexity and ease of implementation. The popular international compression standard JPEG (Joint Photographic Experts Group) can produce noticeable discontinuities along block boundaries, called blocking artifacts. This paper analyses different types of post-processing techniques and proposes a simple approach to reduce the blocking artifacts that occur during image compression.

Keywords- Post-processing, DCT, BDCT, Compression Artifact.

I. INTRODUCTION

Digital image applications have become increasingly popular in every field of life. In a rapidly growing technical environment, images and video are a very useful means of visual communication between human beings. Image processing is used to address different problems associated with images and video, such as image restoration, image enhancement and segmentation. Post-processing techniques are mainly used together with image restoration and image enhancement. Image processing procedures needed for fully automated and quantitative analysis (registration, segmentation, visualization) require images with a high signal-to-noise ratio and as few artifacts as possible in order to perform well.

Digital images and video are mostly coded using the DCT and the DWT. When data have to be transferred at low bit rates, these coding techniques introduce many visual distortions and imperfections, called artifacts. Noise and artifacts are often confused. According to my study, the difference is that noise may obscure features in an image, while artifacts appear to be features but are not. If the 'problem' is structured, it is probably an artifact; if it is random, it is probably noise. Noise can be understood as a degradation of the image due to random occurrences that have no relation to the true object; it is scattered throughout the image but is most noticeable in darker and shadow areas. Artifacts are sometimes also called jaggies, although this term more properly refers to the jagged edges that JPEG conversion leaves on diagonal lines and in areas of flat colour; these are caused by the processing of the image, in camera or in post-processing, when the image is compressed to JPEG from the native file. An artifact is part of the contents of an image that has no counterpart in the physical object being imaged. My aim is to describe post-processing techniques for the reduction of compression artifacts and to propose a simple technique for blocking artifact reduction. This paper is organized as follows.
Section II presents a classification of artifacts. Section III describes post-processing and gives an overview of different post-processing methods for reducing image and video artifacts. Section IV presents the proposed algorithm. Section V shows results and discussion. Finally, the conclusion is given in Section VI.

II. ARTIFACT

Artifacts are visible distortions that occur at different stages of the communication process, i.e. the image displayed to the final recipient may differ from the original. According to where they occur, image and video artifacts are divided into four types: those due to capture, processing, delivery and display [1].

Figure 1: Image processing model for communication (original image -> encoding using DCT -> decoding using IDCT -> distorted image).
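For illustration, the model of Figure 1 can be reproduced in a few lines of code. The sketch below is a minimal version that assumes 8 x 8 blocks and a single uniform quantization step q rather than the JPEG quantization tables; coarser steps make the blocking artifacts described below clearly visible.

```python
# Minimal sketch of the Figure 1 pipeline: block DCT encoding, coarse
# quantization, and IDCT decoding. A single uniform quantization step q is
# assumed instead of the JPEG tables; larger q produces stronger blocking.
import numpy as np
from scipy.fft import dctn, idctn

def block_dct_codec(image, block=8, q=40.0):
    out = image.astype(np.float64).copy()
    h, w = image.shape
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            blk = out[y:y+block, x:x+block]
            coeffs = dctn(blk, norm='ortho')           # forward 2-D DCT of the block
            coeffs = np.round(coeffs / q) * q          # coarse uniform quantization
            out[y:y+block, x:x+block] = idctn(coeffs, norm='ortho')
    return np.clip(out, 0, 255)

# Example: even a smooth gradient shows visible block edges after coding.
img = np.tile(np.linspace(0, 255, 64), (64, 1))
decoded = block_dct_codec(img, q=60.0)
```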

In this paper the main stress is on compression artifacts, which result from the processing stage. Image artifacts are basically the result of lossy compression: during compression, data that are too complex to represent at the available data rate are discarded, and the discarded data show up as compression artifacts. The greatest technical limitation in image and video processing is the available bandwidth, which constrains the compression ratio; as the compression ratio increases, the artifacts become more visible.

TYPES OF ARTIFACT:

A. RINGING ARTIFACT:
Ringing is caused by coarse quantization of high-frequency components. It appears as sharp oscillations near strong, sharp transitions in the signal. All coding schemes that involve quantization in the frequency domain produce some ringing. Ringing artifacts often occur when a DWT encoder is used, and the effect can spread out further in images compressed with non-block-based coders. Just as blocking is the dominant artifact of DCT coding, ringing is the most noticeable artifact of wavelet coding; it is less visible in JPEG compression and becomes more visible at low bit rates. Ringing is also an issue for real-time processing applications.

B. BLOCKING ARTIFACT:
Images are often coded using the BDCT or the DWT. Blocking artifacts are most noticeable where block-based DCT coding is used, in which the image is divided into N x N blocks. The BDCT is adopted by widely used image/video compression standards such as JPEG, MPEG-1, MPEG-2, MPEG-4 and H.263 due to its high energy compaction, low computational complexity and ease of implementation. After splitting an image into N x N blocks, the blocks are independently DCT-transformed, quantized, coded and transmitted. When the correlation between two adjacent blocks is disturbed, horizontal and vertical borders appear in the image; this is the blocking artifact, the most visible degradation of block transform coding. Blocking artifacts appear at block edges, especially at curves and corners (a simple numerical indicator of this effect is sketched at the end of this section). Blocking can be avoided altogether by saving images in a lossless file format. Images created by ray-tracing programs also show blockiness on terrain. Blocking artifacts are further divided into three categories:
Staircase noise: appears on block edges; the edge is degraded so that the block bands resemble the edge. This happens especially with diagonal edges, which take on a staircase-like character.
Grid noise: another form of artifact which occurs at higher bit rates in the decompressed data.
Corner outliers: visible at the corner points of blocks, where the corner pixel is either much larger or much smaller than its neighbouring pixels [2].

C. BLUR ARTIFACT:
The absence of high frequencies in low bit rate video results in blurring. Blurring means that the image is smoother than the original, produced by the increased thickness of edges, i.e. image detail becomes blurred. Blur and ringing both result from quantization of coefficients in the frequency domain. One difference between the two effects is that they appear on different sides, i.e. horizontal and vertical; the blurring effect also decreases as the viewing distance increases. A second difference is that with blurring the correlation between adjacent pixels in the same row or column for the LH (HL) orientation increases, whereas with ringing this correlation is reduced.

D. COLOUR DISTORTION:
As human eyes are not as sensitive to colour as to brightness, much of the detailed colour (chrominance) information is discarded while luminance is retained. This process is called "chroma subsampling": a colour image is split into a brightness image and two colour images. The brightness (luma) image is stored at the original resolution, whereas the two colour (chroma) images are stored at a lower resolution. The compressed images look slightly washed out, with less brilliant colour.

E. TEXTURE DEVIATION:
Another type of distortion, known as texture deviation, is caused by the loss of fidelity in mid-frequency components and appears as granular noise in transform coding. It is less visible to the human visual system. However, in segmentation- or model-based coding, texture deviation often manifests itself as an over-smoothing of texture patterns that can be visually annoying.
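Of the artifacts listed above, blocking (subsection B) lends itself to a very simple numerical check: pixel differences that straddle the block grid are compared with differences inside the blocks. The sketch below is only an illustrative indicator, assuming an 8-pixel grid and using column differences; it is not a metric taken from the cited literature.

```python
# Simple blockiness indicator for an 8x8 block grid: the mean absolute
# difference across vertical block boundaries divided by the mean absolute
# difference between columns inside blocks. Values well above 1 suggest
# visible blocking; values near 1 mean boundary and interior look alike.
import numpy as np

def blockiness(image, block=8):
    img = image.astype(np.float64)
    col_diff = np.abs(np.diff(img, axis=1))        # differences between adjacent columns
    cols = np.arange(col_diff.shape[1])
    at_boundary = (cols % block) == (block - 1)    # differences that cross a block edge
    boundary = col_diff[:, at_boundary].mean()
    interior = col_diff[:, ~at_boundary].mean()
    return boundary / (interior + 1e-8)
```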

III. REVIEW OF POSTPROCESSING TECHNIQUES FOR ARTIFACT REDUCTION

A highly compressed image introduces a range of artifacts. Post-processing is used to remove these compression artifacts. It is implemented at the decoder end to reduce the artifacts and improve the overall image quality at a given bit rate, without increasing the bit rate or modifying the coding procedure.

Figure 2: Process of transform coding (original image, pre-processing, DCT encoding, post-processing).

The post-processing algorithm takes a distorted image as input. To process the image optimally, the algorithm needs the parameters that were used in the encoding process: the block size and the quantization parameter (JPEG quality) [3].

Figure 3: Post-processing scheme for artifact reduction (decoder with entropy decoding, dequantization and IDCT producing the distorted image, followed by artifact detection and artifact reduction steps that yield the enhanced output).

Post-processing techniques can be divided into the following types:
1) Pixel-domain (spatial/temporal) filtering.
2) Motion-compensated algorithms.
3) Algorithms that transform the signal to the frequency domain.
4) Iterative approaches based on the theory of projections onto convex sets (POCS).
There is an even wider range of post-processing algorithms to reduce artifacts; many approaches have been proposed in the literature aiming at alleviating artifacts in images and video.

A. Pixel-Domain (Spatial/Temporal) Filtering Algorithms:
Denoising can be achieved by spatially averaging nearby pixels; this removes noise but creates blur. Hence neighbourhood filters, which average neighbouring pixels, are used to remove the noise. Wiest-Daesslé and Prima [4] describe the Non-Local Means (NLM) algorithm, which removes artifacts while retaining the meaningful image information. NLM exploits the redundancy and self-similarity of the image: it is based on the assumption that image contents are likely to repeat themselves within some neighbourhood, so each pixel is denoised by a weighted average of the pixels in its neighbourhood, as illustrated in the sketch below. For improving image and video quality, Choy and Chan [5] propose an algorithm that uses local statistics of transform coefficients. They observed that pixel brightness varies more between blocks than within a block, and border pixels are filtered by a spatial algorithm. This approach reduces blocking artifacts but simultaneously introduces additional blur at image edges.
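The non-local means idea can be sketched as follows. This is a simplified, didactic version with an assumed patch size, search window and filtering parameter h; it is not the implementation of [4].

```python
# Simplified non-local means: each pixel becomes a weighted average of the
# pixels in a small search window, with weights derived from the similarity
# of the surrounding patches (similar patches -> larger weight). This plain
# loop version is O(N * window^2 * patch^2) and meant only for small images.
import numpy as np

def nlm_denoise(img, patch=3, window=7, h=10.0):
    img = img.astype(np.float64)
    pr, wr = patch // 2, window // 2
    pad = np.pad(img, pr + wr, mode='reflect')
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            yc, xc = y + pr + wr, x + pr + wr          # position in the padded image
            ref = pad[yc-pr:yc+pr+1, xc-pr:xc+pr+1]    # patch around the pixel
            weights, acc = 0.0, 0.0
            for dy in range(-wr, wr + 1):
                for dx in range(-wr, wr + 1):
                    cand = pad[yc+dy-pr:yc+dy+pr+1, xc+dx-pr:xc+dx+pr+1]
                    d2 = np.mean((ref - cand) ** 2)    # patch dissimilarity
                    w = np.exp(-d2 / (h * h))          # similar patches get larger weight
                    weights += w
                    acc += w * pad[yc+dy, xc+dx]
            out[y, x] = acc / weights
    return out
```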
B. Frequency-Domain Algorithms:
Frequency-domain algorithms transform the image or video frames to the frequency domain and modify DCT or DWT coefficients. These approaches are relatively complex because the signal has to be transformed from the spatial to the frequency domain and back. Paek and Kim [6] proposed an algorithm in which two adjacent homogeneous blocks u_h and v_h are found at a block boundary, where a homogeneous block is defined as a block in which no adjacent-pixel difference is larger than the difference across the block boundary; the high-frequency components caused mainly by the block edge are then examined by applying the DCT to u_h and v_h. H. R. Wu, T. Chen and B. Qiu [7] proposed a post-filter that uses the DCT coefficients of shifted blocks to deblock while preserving detail. For each block, its DC value and the DC values of the surrounding eight neighbour blocks are exploited to predict low-frequency AC coefficients; these predicted AC coefficients allow the spatial characteristics of a block before the quantization stage of the encoder to be inferred. Liu and Bovik [8] proposed a fast, blind DCT-domain measurement and reduction of blocking artifacts. In their algorithm, blocking artifacts are modelled as 2-D step functions; a fast DCT-domain algorithm extracts all parameters needed to detect the presence of, and estimate the amplitude of, blocking artifacts by exploiting several properties of the human visual system, and using this estimate of blockiness, a DCT-domain method then adaptively reduces the detected artifacts. A sketch of the general DCT-domain deblocking idea follows.
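The sketch below illustrates the general shifted-block, DCT-domain deblocking idea discussed in this subsection: a window that straddles a block boundary is re-transformed and its high-frequency coefficients are attenuated when the window is otherwise smooth. The block size, the activity threshold, the number of retained coefficients and the function name are all assumptions for illustration; this is not the exact filter of [6], [7] or [8].

```python
# Illustrative DCT-domain smoothing across one vertical block boundary: take
# the 8x8 window shifted by half a block so that it straddles the boundary,
# transform it, keep only the low-frequency coefficients if the window is
# otherwise smooth, and transform back.
import numpy as np
from scipy.fft import dctn, idctn

def smooth_boundary(img, bx, block=8, activity_thresh=20.0, keep=2):
    """bx is a block-boundary column at least half a block from the border."""
    out = img.astype(np.float64).copy()
    half = block // 2
    for y in range(0, img.shape[0] - block + 1, block):
        win = out[y:y+block, bx-half:bx+half]          # window straddling the boundary
        coeffs = dctn(win, norm='ortho')
        # Heuristic: only filter windows whose diagonal AC activity is low,
        # so genuine image detail is left untouched.
        if np.abs(coeffs[1:, 1:]).mean() < activity_thresh:
            mask = np.zeros_like(coeffs)
            mask[:keep, :keep] = 1.0                   # keep low-frequency coefficients
            out[y:y+block, bx-half:bx+half] = idctn(coeffs * mask, norm='ortho')
    return out
```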

C. Motion-Compensated Algorithms:
Motion-compensated algorithms are used to reduce different types of artifacts, and they are also combined with spatial- and frequency-domain approaches for better results. D. Bagni [9] presents a post-processing algorithm for low bit rate videoconferencing on ISDN lines that uses both motion estimation and motion compensation. In this algorithm, the limited number of frames per second transmitted during low bit rate videoconferencing is post-processed inside the receiving terminal to obtain an increased frame rate of 25 Hz or 30 Hz. The paper by Lin and Tai [10] establishes the optimality-preserving ability of motion estimation using integral projections, together with an efficient method for preparing the integral projections of all candidate blocks in the search window, and shows a performance gain over motion estimation that uses all the individual pixels in the block.

D. Algorithms Based on the Theory of Projections onto Convex Sets (POCS):
POCS-based methods represent prior information about the original image as convex sets and recover the image by iteratively projecting onto these sets. Avideh Zakhor [11] proposed an iterative blocking-artifact reduction technique based on POCS: the image is projected onto multiple sets, each set representing a constraint in image space, and the two constraints Zakhor uses are the quantization constraint and the band-limitation constraint. Changhoon Yim and Nam Ik Cho [12] give a new non-iterative post-processing approach based on POCS in the DCT domain, called D-POCS, for complexity reduction; in D-POCS the low-pass filtering is performed in the DCT domain, so the IDCT and DCT modules are not required. Yoon Kim, Chun-Su Park and Sung-Jea Ko [13] proposed a new smoothness constraint set and its projection operator based on POCS in the DCT domain to reduce blocking artifacts in BDCT-coded images; their post-processing also operates fully in the frequency domain without DCT/IDCT operations, which saves between 41% and 64% of the computation time.

IV. A PROPOSED TECHNIQUE FOR REDUCING BLOCKING ARTIFACT:

The proposed algorithm removes the discontinuities present at block borders in DCT-decompressed images. The steps of the algorithm are:
Step 1: Find a set of properties for each labelled region in the label matrix that has a discontinuity at the block border. These discontinuities are reflected as AC components in the DCT of the corresponding blocks.
Step 2: Fill up these blocks according to the properties of the nearby blocks.
Step 3: Combine the obtained blocks with the image matrices to form the restored image.
Let b1, b2, ..., bn be the blocks that show a discontinuity with respect to other blocks. The area within each block where the discontinuity occurs is found, a set of the affected pixels is created, and the gap producing the discontinuity is filled by blending with nearby pixels on the basis of colour symmetry. To create a discontinuity set: if p1, p2, p3, ... is a set of pixels belonging to block b1, and a second set of pixels belongs to block b2, then for each block the pixels with a dissimilar value at the block border are marked with the neighbouring pixel value until the discontinuity vanishes. A hedged sketch of these steps is given below.
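One possible reading of Steps 1-3 is sketched here: locate the pixels at block borders whose jump to the adjacent block is much larger than the variation inside the blocks, and replace them with a value blended from the neighbouring block. The 8 x 8 grid, the threshold factor and the simple averaging are assumptions of mine; the paper does not give them numerically.

```python
# Hedged sketch of the proposed boundary-discontinuity removal (Steps 1-3):
# for each vertical block boundary, pixels whose across-boundary jump greatly
# exceeds the local within-block variation are treated as discontinuity pixels
# and replaced by the average of their two neighbours across the border.
import numpy as np

def reduce_block_discontinuities(img, block=8, factor=2.0):
    out = img.astype(np.float64).copy()
    h, w = out.shape
    for bx in range(block, w - 1, block):              # vertical block boundaries
        left, right = out[:, bx-1], out[:, bx]
        jump = np.abs(left - right)                    # across-boundary difference
        inner = 0.5 * (np.abs(out[:, bx-2] - out[:, bx-1])
                       + np.abs(out[:, bx] - out[:, bx+1]))
        mask = jump > factor * (inner + 1.0)           # Step 1: discontinuity pixels
        blend = 0.5 * (left + right)                   # Step 2: fill from the nearby block
        out[mask, bx-1] = blend[mask]
        out[mask, bx] = blend[mask]                    # Step 3: recombine into the image
    return out
```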
V. RESULTS AND DISCUSSION:

In this paper a simple algorithm is proposed for reducing image blocking artifacts. The proposed algorithm was implemented in MATLAB. Six different images have been used as reference images to check the image quality after artifact reduction, and different image samples have been compared against the existing corner outlier algorithm; the results obtained are shown in Table 1. PSNR and MSE are used as quality metrics to compare this algorithm with the corner outlier algorithm. From Table 1 it is clear that the proposed algorithm gives better results. The algorithm mainly relies on a neighbourhood search to remove the discontinuities that appear at block borders; it largely preserves the high-frequency components while smoothing out the boundary discontinuities. Its effectiveness in removing blocking artifacts can be seen from the PSNR trend, and it is also effective at very low bit rates.

Thus one can reduce image compression blocking artifacts using the proposed algorithm. The reference images used for quality measurement are shown below.

Figure 4: The original uncompressed test images: (a) Lena, (b) Cameraman, (c) Baboon, (d) Eye, (e) Level10, (f) Car.

Two error metrics, the Mean Square Error (MSE) and the Peak Signal-to-Noise Ratio (PSNR), are used to compare the proposed work with the corner outlier technique of artifact reduction. The MSE is the cumulative squared error between the compressed and the original image, whereas the PSNR is a measure of the peak error. The mathematical formulae for the two are

MSE = \frac{1}{MN}\sum_{x=1}^{M}\sum_{y=1}^{N}\left[I(x,y)-I'(x,y)\right]^{2}

PSNR = 10\log_{10}\!\left(\frac{255^{2}}{MSE}\right)

where I(x,y) is the original image, I'(x,y) is the approximated version (the decompressed image), M, N are the dimensions of the images and 255 is the peak pixel value for 8-bit images. A lower MSE means less error and, given the inverse relation between MSE and PSNR, translates into a higher PSNR. A higher PSNR is good because it means the ratio of signal to noise is higher; here the 'signal' is the original image and the 'noise' is the error in reconstruction. So an artifact reduction scheme with a lower MSE (and a higher PSNR) can be recognized as the better one.

Table 1: Comparison of the proposed work and the corner outlier algorithm on different test images using the PSNR and MSE metrics.

Sr. No.  Test Image   Corner outlier filtered   Proposed algorithm
                      PSNR      MSE             PSNR      MSE
1        Lena         35.14     4.46            48.13     1
2        Cameraman    37.581    3.37            48.14     0.99
3        Baboon       34.09     5.03            48.13     1
4        Eye          31.22     7.01            48.13     0.99
5        Level10      37.99     3.21            49.22     0.88
6        Car          37.46     3.41            48.14     0.99

Figure 5: (a) the region around the shoulder of the original Lena image; (b) the compressed output; (c) and (d) the results of the artifact reduction algorithms, where (c) is the corner outlier filtered result and (d) is the result of the proposed algorithm.

Figure 6: PSNR and MSE value graph for the proposed and corner outlier algorithms.
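The two metrics used in Table 1 can be computed directly from the definitions above. A minimal sketch, assuming 8-bit images (peak value 255):

```python
# MSE and PSNR exactly as defined above, assuming 8-bit images (peak = 255).
import numpy as np

def mse(original, decoded):
    diff = original.astype(np.float64) - decoded.astype(np.float64)
    return np.mean(diff ** 2)            # cumulative squared error divided by M*N

def psnr(original, decoded):
    m = mse(original, decoded)
    return float('inf') if m == 0 else 10.0 * np.log10(255.0 ** 2 / m)
```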

VI. CONCLUSION:

In this paper a post-processing technique is proposed to alleviate the blocking artifacts in decoded images. In this approach, the proposed algorithm removes the discontinuities present in the blocks of DCT-decompressed images. A set of properties is measured for each labelled region in the label matrix that shows a discontinuity at the block border; these discontinuities are reflected as AC components in the DCT of the corresponding blocks. By filling up these blocks according to the properties of the nearby blocks, the blocking effect at the boundaries can be reduced. This research work provides an accurate and simple analysis of artifact reduction. It is concluded that the proposed algorithm gives better results than the corner outlier algorithm, providing higher PSNR and lower MSE values than the corner outlier technique.

REFERENCES:
[1]. Amal Punchihewa and Donald G. Bailey, "Artefacts in Image and Video Systems: Classification and Mitigation".
[2]. Anudeep Gandam and Jagroop Singh Sidhu, "A Post-Processing Algorithm for Detection and Removal of Corner Outliers", International Journal of Computer Applications (0975-8887), Vol. 4, No. 2, July 2010.
[3]. Mei-Yin Shen and C.-C. Jay Kuo, "Review of Postprocessing Techniques for Compression Artifact Removal", Journal of Visual Communication and Image Representation, Vol. 9, No. 1, March 1998.
[4]. N. Wiest-Daesslé, S. Prima, P. Coupé, S. P. Morrissey and C. Barillot, "Non-Local Means Variants for Denoising of Diffusion-Weighted and Diffusion Tensor MRI".
[5]. S. S. O. Choy and Y. H. Chan, "Reduction of Block-Transform Image Coding Artifacts by Using Local Statistics of Transform Coefficients", IEEE Signal Processing Letters, Vol. 4, January 1997.
[6]. Hoon Paek, Rin-Chul Kim and Sang-Uk Lee, "On the POCS-Based Postprocessing Technique to Reduce the Blocking Artifacts in Transform Coded Images", IEEE Transactions on Circuits and Systems for Video Technology, Vol. 8, No. 3, June 1998, pp. 358-367.
[7]. H. R. Wu, T. Chen and B. Qiu, "Adaptive Postfiltering of Transform Coefficients for the Reduction of Blocking Artifacts", IEEE Transactions on Circuits and Systems for Video Technology, Vol. 11, pp. 594-602, May 2001.
[8]. S. Liu and A. C. Bovik, "Efficient DCT-Domain Blind Measurement and Reduction of Blocking Artifacts", IEEE Transactions on Circuits and Systems for Video Technology, Vol. 12, No. 12, December 2002.
[9]. D. Bagni, G. De Haan and V. Riva, "Motion Compensated Post-Processing for Low Bit-Rate Video Conferencing on ISDN Lines".
[10]. Yih-Chuan Lin and Shen-Chuan Tai, "Fast Full-Search Block-Matching Algorithm for Motion-Compensated Video Compression", IEEE Transactions on Communications, Vol. 45, No. 5, May 1997.
[11]. A. Zakhor, "Iterative Procedures for Reduction of Blocking Effects in Transform Image Coding", IEEE Transactions on Circuits and Systems for Video Technology, Vol. 2, No. 1, March 1992, pp. 91-95.
[12]. Changhoon Yim and Nam Ik Cho, "Blocking Artifact Reduction Method Based on Non-iterative POCS in the DCT Domain", 0-7803-9134-9/05, IEEE, 2005.
[13]. Yoon Kim, Chun-Su Park and Sung-Jea Ko, "Frequency Domain Post-Processing Technique Based on POCS", Electronics Letters, Vol. 39, No. 22, 30 October 2003.