Article

Blur Resistant Image Authentication Method with Pixel-wise Tamper Localization


Abstract

A new digital-signature-based method for image authentication and pixel-wise tamper localization is proposed. The method is resistant to content-preserving enhancement operations such as blurring or sharpening. The authentication function is implemented separately from tamper localization, both to increase processing speed and to prevent algorithmic oracle attacks.
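The abstract does not specify the transformation used, but the general idea of signing a blur-tolerant image feature can be sketched as follows. This is a hypothetical illustration, not the paper's method: coarsely quantized block means stand in for the blur-robust feature, and an HMAC stands in for the digital signature. The names `box_blur` and `blur_robust_signature` are invented for this sketch.

```python
import hashlib
import hmac

def box_blur(img):
    """3x3 box blur with edge replication (a content-preserving enhancement)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    s += img[yy][xx]
            out[y][x] = s / 9
    return out

def blur_robust_signature(img, key, block=4, step=32):
    """Quantize block means coarsely (mild blur leaves them in the same
    bucket), then sign the resulting feature vector with a keyed digest."""
    h, w = len(img), len(img[0])
    feats = []
    for by in range(0, h, block):
        for bx in range(0, w, block):
            vals = [img[y][x]
                    for y in range(by, by + block)
                    for x in range(bx, bx + block)]
            feats.append(round(sum(vals) / len(vals) / step))
    return hmac.new(key, bytes(feats), hashlib.sha256).hexdigest()
```

With this structure, blurring the image leaves the signature unchanged, while replacing a region of pixels shifts a block mean across a quantization boundary and invalidates it.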


Article
Full-text available
In this paper, robust image authentication integrated with semi-fragile pixel-wise tamper localization is analyzed. A new pixel-wise transformation, robust to blurring/sharpening while fragile to all other image processing operations, is proposed. A new method featuring binary and percentage measures, with the novel ability to integrate human opinion into image authenticity evaluation, is presented. Advantages include protection of all bits in each pixel and the small size of the signature, namely less than 10% of the initial image.
Article
Full-text available
Edges are among the most common image features used in machine vision, and there is a substantial body of research on techniques for performing edge detection. Edges are useful in many applications, such as image comparison and recognition. An edge detection method with subpixel accuracy is presented here. The method is based on the observation that areas of different intensity and size influence pixel brightness according to a relation function, and functions are presented for calculating one point of the edge passing through a pixel. Test results show that 47% of points are estimated with a standard deviation of 0.01, 88% with a standard deviation of 0.05, and 94% with a standard deviation of 0.06. It is also found that linearity decreases by more than 5% when the edge cuts off a triangle whose area is less than 10% of the pixel area. Ill. 7, bibl. 6 (in English; summaries in English, Russian and Lithuanian).
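The core idea — that a pixel straddling an edge takes a brightness proportional to the areas on each side — can be illustrated in one dimension. This is a minimal sketch of that principle, not the paper's actual functions; the name `subpixel_edge_1d` and the assumption of a known two-level step edge are inventions of this example.

```python
def subpixel_edge_1d(row, low, high):
    """Locate a step edge with subpixel accuracy from area-sampled pixels.

    Assumes each pixel's brightness is the area-weighted mix of the two
    intensities on either side of the edge: for the pixel containing the
    edge, b = low*f + high*(1 - f), where f is the fraction of the pixel
    covered by the `low` side. Inverting that relation gives the edge
    position inside the pixel.
    """
    for i, b in enumerate(row):
        if min(low, high) < b < max(low, high):  # partially covered pixel
            f = (b - high) / (low - high)        # fraction on the `low` side
            return i + f
    return None  # no transitional pixel found
```

For an ideal area-sampled step edge the recovery is exact; in real images, noise and optical blur limit the accuracy, which is consistent with the standard-deviation figures reported in the abstract.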
Article
Measurement of visual quality is of fundamental importance to numerous image and video processing applications. The goal of quality assessment (QA) research is to design algorithms that can automatically assess the quality of images or videos in a perceptually consistent manner. Image QA algorithms generally interpret image quality as fidelity or similarity with a "reference" or "perfect" image in some perceptual space. Such "full-reference" QA methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychovisual features of the human visual system (HVS), or by signal fidelity measures. In this paper, we approach the image QA problem as an information fidelity problem. Specifically, we propose to quantify the loss of image information to the distortion process and explore the relationship between image information and visual quality. QA systems are invariably involved with judging the visual quality of "natural" images and videos that are meant for "human consumption." Researchers have developed sophisticated models to capture the statistics of such natural signals. Using these models, we previously presented an information fidelity criterion for image QA that related image quality with the amount of information shared between a reference and a distorted image. In this paper, we propose an image information measure that quantifies the information that is present in the reference image and how much of this reference information can be extracted from the distorted image. Combining these two quantities, we propose a visual information fidelity measure for image QA. We validate the performance of our algorithm with an extensive subjective study involving 779 images and show that our method outperforms recent state-of-the-art image QA algorithms by a sizeable margin in our simulations. The code and the data from the subjective study are available at the LIVE website.
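The structure of such an information-fidelity measure can be sketched in the spatial domain: model each distorted block as the reference block passed through a gain-plus-noise channel, and compare how much information survives that channel against how much the reference contained. This is a simplified illustration of the idea, not the published algorithm (which operates on natural-scene statistics in the wavelet domain); the function name `vif_spatial` and the visual-noise parameter `sigma_n2` are assumptions of this sketch.

```python
import math

def vif_spatial(ref, dist, sigma_n2=2.0, block=4):
    """Simplified spatial-domain information-fidelity ratio.

    Models each distorted block as D = g*R + V (deterministic gain plus
    additive noise), estimates g and the noise variance by regression, and
    returns (information extracted from the distorted image) divided by
    (information in the reference). sigma_n2 is an assumed visual-noise
    variance standing in for the HVS model.
    """
    def block_stats(y0, x0):
        n = block * block
        pa = [ref[y][x] for y in range(y0, y0 + block) for x in range(x0, x0 + block)]
        pb = [dist[y][x] for y in range(y0, y0 + block) for x in range(x0, x0 + block)]
        ma, mb = sum(pa) / n, sum(pb) / n
        vr = sum((p - ma) ** 2 for p in pa) / n
        cov = sum((p - ma) * (q - mb) for p, q in zip(pa, pb)) / n
        vd = sum((q - mb) ** 2 for q in pb) / n
        return vr, vd, cov

    num = den = 0.0
    h, w = len(ref), len(ref[0])
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            vr, vd, cov = block_stats(y0, x0)
            g = cov / (vr + 1e-10)               # channel gain estimate
            sv2 = max(vd - g * cov, 0.0)         # residual (noise) variance
            num += math.log2(1.0 + g * g * vr / (sv2 + sigma_n2))
            den += math.log2(1.0 + vr / sigma_n2)
    return num / den if den else 1.0
```

An undistorted image yields a ratio of 1, and any distortion that attenuates or corrupts the signal (contrast loss, noise, blur) drives the ratio below 1, matching the behavior the abstract describes.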