May 2021 · 67 Reads · 22 Citations
March 2021 · 106 Reads · 1 Citation
Anomaly detection is the task of deciding whether an input sample belongs to the distribution of a target normal class or to an anomaly class. Conventional generative adversarial network (GAN)-based methods take the entire image, including both foreground and background, as input. In these methods, however, regions unrelated to the normal class (e.g., irrelevant background) are learned as part of the normal-class distribution, leading to false detections. To alleviate this problem, this paper proposes a novel two-stage network consisting of an attention network and an anomaly detection GAN (ADGAN). The attention network generates an attention map that indicates the region representing the normal-class distribution. To generate an accurate attention map, we propose an attention loss and an adversarial anomaly loss based on synthetic anomaly samples produced by hard augmentation. By applying the attention map to an image feature map, ADGAN learns the normal-class distribution with the irrelevant regions removed, which greatly reduces the difficulty of the anomaly detection task. Additionally, the estimated attention map can be used for anomaly segmentation because it distinguishes between normal and anomalous regions. As a result, the proposed method outperforms state-of-the-art anomaly detection and anomaly segmentation methods on widely used datasets.
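The core masking step described in the abstract, applying an attention map to a feature map so that background regions are suppressed before the GAN models the distribution, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the toy shapes are assumptions, and the real attention map would be produced by the paper's attention network rather than hand-written.

```python
import numpy as np

def apply_attention(feature_map: np.ndarray, attention_map: np.ndarray) -> np.ndarray:
    """Mask a feature map with a [0, 1] attention map, broadcast over channels.

    feature_map:   (C, H, W) array of image features.
    attention_map: (H, W) array with values in [0, 1]; positions near 0
                   (e.g., background unrelated to the normal class) are
                   suppressed, so a downstream model only sees the
                   attended, normal-class region.
    """
    return feature_map * attention_map[np.newaxis, :, :]

# Toy example: a 1-channel 2x2 feature map whose right column is "background".
features = np.array([[[1.0, 2.0],
                      [3.0, 4.0]]])          # shape (1, 2, 2)
attn = np.array([[1.0, 0.0],
                 [1.0, 0.0]])                # right column masked out
masked = apply_attention(features, attn)     # background features become 0
```

Because the mask is element-wise, regions the attention network marks as irrelevant contribute nothing to the reconstruction objective, which is how the method avoids learning the background as part of the normal-class distribution.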
October 2020 · 5 Reads
... Muneer et al. [26] proposed a new attention-based DCNN architecture for RUL prediction. Song et al. [27] proposed a novel RUL estimation method based on an attention mechanism to capture long-term dependencies in sequence data. Zhang et al. [28] proposed a method called dual-aspect self-attention based on transformer (DAST), which improves the processing of long data sequences by using an encoder-decoder structure. ...
May 2021
... Among these GAN-based anomaly detection methods, the ways in which the discriminator's output is used to distinguish the two types of data can be roughly divided into two categories. One, used by models such as [31], applies the discriminator directly to classify the input data, with the final output being a label. The other, used by models such as [2, 35], introduces the concept of an anomaly score. ...
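The second category mentioned above, scoring instead of hard classification, is commonly realized as a weighted sum of a reconstruction residual and a discriminator feature residual (the AnoGAN-style formulation). The sketch below is a generic illustration of that idea under assumed inputs; the variable names, the weight `lam`, and the toy vectors are not from the cited works.

```python
import numpy as np

def anomaly_score(x: np.ndarray, x_rec: np.ndarray,
                  feat_x: np.ndarray, feat_rec: np.ndarray,
                  lam: float = 0.1) -> float:
    """AnoGAN-style anomaly score: higher means more anomalous.

    x / x_rec:       input sample and its GAN reconstruction.
    feat_x/feat_rec: discriminator features of the input and reconstruction.
    lam:             weight balancing the two residual terms.
    """
    residual_loss = np.abs(x - x_rec).sum()            # pixel-space residual
    feature_loss = np.abs(feat_x - feat_rec).sum()     # feature-space residual
    return float((1.0 - lam) * residual_loss + lam * feature_loss)

# A normal sample is reconstructed well, so its score stays near zero;
# an anomalous sample reconstructs poorly and scores higher.
score_normal = anomaly_score(np.array([1.0, 2.0]), np.array([1.0, 2.0]),
                             np.array([0.5]), np.array([0.5]))
score_anomaly = anomaly_score(np.array([1.0, 2.0]), np.array([5.0, 2.0]),
                              np.array([0.5]), np.array([2.5]))
```

Thresholding this continuous score (rather than taking the discriminator's label directly) is what separates the two categories of methods described in the snippet.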
March 2021