Blocking artifact detection and reduction in compressed data

Inf. Process. Lab., Aristotle Univ. of Thessaloniki
IEEE Transactions on Circuits and Systems for Video Technology (Impact Factor: 2.62). 11/2002; 12(10):877-890. DOI: 10.1109/TCSVT.2002.804880
Source: DBLP


A novel frequency-domain technique for image blocking artifact detection and reduction is presented. The algorithm first detects the regions of the image that exhibit visible blocking artifacts. This detection is performed in the frequency domain and uses the estimated relative quantization error calculated when the discrete cosine transform (DCT) coefficients are modeled by a Laplacian probability function. Then, for each block affected by blocking artifacts, its DC and AC coefficients are recalculated for artifact reduction. To achieve this, a closed-form representation of the optimal correction of the DCT coefficients is produced by minimizing a novel enhanced form of the mean squared difference of slopes for every frequency separately. The correction of each DCT coefficient depends on the eight neighboring coefficients in the subband-like representation of the DCT transform and is constrained by the quantization upper and lower bounds. Experimental results illustrating the performance of the proposed method are presented and evaluated.
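The paper's correction operates in the DCT domain, but the underlying blockiness criterion, the mean squared difference of slopes (MSDS), is easiest to see in the pixel domain. The sketch below is a simplified pixel-domain MSDS measure over vertical block boundaries only; it is not the authors' frequency-domain closed form, and the function name and boundary-only formulation are illustrative assumptions.

```python
import numpy as np

def boundary_msds(img, block=8):
    """Simplified pixel-domain sketch of a mean-squared-difference-of-slopes
    (MSDS) blockiness measure over vertical block boundaries.

    For each boundary column b, compare the slope across the boundary
    (img[:, b] - img[:, b-1]) with the average of the slopes just inside
    each block; a large discrepancy indicates a visible blocking artifact."""
    img = img.astype(np.float64)
    h, w = img.shape
    total, count = 0.0, 0
    for b in range(block, w, block):
        across = img[:, b] - img[:, b - 1]       # slope across the boundary
        left = img[:, b - 1] - img[:, b - 2]     # slope just inside left block
        right = img[:, b + 1] - img[:, b]        # slope just inside right block
        expected = 0.5 * (left + right)          # slope if no discontinuity
        total += np.mean((across - expected) ** 2)
        count += 1
    return total / max(count, 1)
```

On a smooth gradient this measure is zero, while a piecewise-constant "blocky" image scores high, which is the property the detection stage exploits.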


Available from: George A. Triantafyllidis, Sep 10, 2013
  • Source
    • "They then took the ratio of the number of coefficients between the two ranges as a distinguishing feature to determine whether a given bitmap has been previously JPEG compressed. There are also several works [2], [3] that could be extended to identify decompressed bitmaps. However, the performance of these works is far from satisfactory, as reported in [12]. "
    ABSTRACT: Estimation of the JPEG compression history of bitmaps has drawn increasing attention in the past decade due to its extensive applications in image processing, image forensics, and steganalysis. In this paper, we propose a novel statistic, named the factor histogram, for estimating the JPEG compression history of bitmaps. In a statistical sense, the factor histogram decreases as its bin index increases for uncompressed bitmaps, whereas for JPEG-decompressed bitmaps it exhibits a local maximum at the bin index corresponding to the quantization step, so it is no longer monotonically decreasing. Based on these characteristics, we propose to identify decompressed bitmaps by measuring the monotonicity of the factor histogram, and to estimate the quantization step of each frequency by locating the bin index of the local maximum in the factor histogram. Experimental results demonstrate that the proposed method outperforms existing methods for a range of image sizes while maintaining low computational cost.
    Full-text · Article · Apr 2015 · Digital Signal Processing
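The excerpt does not define the factor histogram precisely; one plausible reading, assumed here, is a histogram H(k) giving the fraction of rounded DCT coefficients divisible by each candidate factor k. Under that assumption the described behavior follows: H(k) decays roughly as 1/k for unquantized data, but spikes at k = q when every coefficient is a multiple of a quantization step q. The function name and bin range are illustrative.

```python
import numpy as np

def factor_histogram(coeffs, max_factor=16):
    # H(k): fraction of (rounded) coefficients that are integer multiples
    # of k, for k = 2 .. max_factor. For unquantized data H(k) decays
    # roughly as 1/k; if the data was quantized with step q, every value
    # is a multiple of q, so H(q) shows a local maximum.
    c = np.rint(coeffs).astype(int)
    c = c[c != 0]  # zeros are multiples of everything; they carry no signal
    return np.array([np.mean(c % k == 0) for k in range(2, max_factor + 1)])
```

Identifying a decompressed bitmap then reduces to checking whether this histogram is still monotonically decreasing, and the quantization step estimate is the bin index of the local maximum.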
  • Source
    • "Because blocking artefacts result in abrupt changes of pixels across block boundaries, a specific type of metric is required to measure such distortions. In addition to PSNR and SSIM, the MSDS_t [15] (calculated on 16×16 blocks) was used to measure changes in the intensity gradient of the pixels close to the boundary of two blocks. For coarsely quantized blocks, the difference in the intensity gradient across the block boundary is expected to be larger than for blocks which are not coarsely quantized. "
    ABSTRACT: This paper proposes a novel approach to video deblocking which performs perceptually adaptive bilateral filtering by considering color, intensity, and motion features in a holistic manner. The method is based on the bilateral filter, an effective smoothing filter that preserves edges. The bilateral filter parameters are adaptive: they avoid over-blurring of texture regions while eliminating blocking artefacts in smooth regions and areas of slow-motion content. This is achieved by using a saliency map to control the strength of the filter at each individual point in the image based on its perceptual importance. The experimental results demonstrate that the proposed algorithm is effective in deblocking highly compressed video sequences while avoiding over-blurring of edges and textures in salient regions of the image.
    Full-text · Conference Paper · Jul 2014
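A minimal sketch of the saliency-controlled bilateral filter described above, assuming the saliency map reduces to a per-pixel blending weight `strength` in [0, 1]; the actual paper derives that map from color, intensity, and motion features, which is omitted here.

```python
import numpy as np

def bilateral_deblock(img, radius=2, sigma_s=2.0, sigma_r=10.0, strength=None):
    """Bilateral filter sketch for deblocking a grayscale image.

    `strength` is an optional per-pixel map in [0, 1] standing in for the
    saliency control: 0 keeps the original pixel (salient texture/edges),
    1 applies the full filter (smooth regions where artifacts are visible)."""
    img = img.astype(np.float64)
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            # Clip the window at the image borders.
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            patch = img[y0:y1, x0:x1]
            yy, xx = np.mgrid[y0:y1, x0:x1]
            # Spatial weight (distance) times range weight (intensity gap):
            # large intensity differences get low weight, so edges survive.
            spatial = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))
            range_w = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma_r ** 2))
            wgt = spatial * range_w
            out[y, x] = np.sum(wgt * patch) / np.sum(wgt)
    if strength is not None:
        out = strength * out + (1.0 - strength) * img
    return out
```

With `strength` set to zero over salient regions, those pixels pass through untouched, which is the over-blurring safeguard the abstract describes.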
  • Source
    • "However, these methods usually investigate JPEG images coded at low bit rates and aim to reduce the blocking artifacts between block boundaries and the ringing effect around content edges due to lossy JPEG compression. Some of them, e.g., [17]–[22], may be extended to identify JPEG images; however, their performance is very poor based on our experiments, as shown in Section III-A. We also note that there are several reported methods, e.g., [6], [7], and [26]–[30], for other forensics/steganalysis issues which tried to identify double JPEG compressed images and/or further estimate the primary quantization table. "
    ABSTRACT: JPEG is one of the most extensively used image formats. Understanding the inherent characteristics of JPEG may play a useful role in digital image forensics. In this paper, we introduce JPEG error analysis to the study of image forensics. The main errors of JPEG include quantization, rounding, and truncation errors. Through theoretically analyzing the effects of these errors on single and double JPEG compression, we have developed three novel schemes for image forensics: identifying whether a bitmap image has previously been JPEG compressed, estimating the quantization steps of a JPEG image, and detecting the quantization table of a JPEG image. Extensive experimental results show that our new methods significantly outperform existing techniques, especially for images of small size. We also show that the new method can reliably detect JPEG image blocks as small as 8 × 8 pixels compressed with quality factors as high as 98. This performance is important for analyzing and locating small tampered regions within a composite image.
    Preview · Article · Oct 2010 · IEEE Transactions on Information Forensics and Security
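One of the three schemes above, estimating the quantization step, can be sketched as a re-quantization test: the DCT coefficients of a previously compressed block snap to multiples of the true step, with only rounding/truncation noise left over. The helper names and the simple mean-absolute-error criterion here are illustrative assumptions, not the paper's exact statistics.

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix (the transform used by JPEG blocks).
    k, i = np.mgrid[0:n, 0:n]
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)
    return m

def requant_error(block, step):
    # Mean error when the block's DCT coefficients are snapped to multiples
    # of `step`: near zero if the block was previously quantized with that
    # step (up to rounding/truncation noise), noticeably larger otherwise.
    d = dct_matrix(block.shape[0])
    c = d @ block @ d.T
    return np.mean(np.abs(c - step * np.rint(c / step)))
```

Scanning candidate steps and picking the one with a sharp error minimum gives a quantization-step estimate per frequency; comparing errors across candidate tables extends the same idea to table detection.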