Ye In Park’s scientific contributions

What is this page?


This page lists works of an author who doesn't have a ResearchGate profile or hasn't added the works to their profile yet. It is automatically generated from public (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (3)


Attention-based Bidirectional LSTM-CNN Model for Remaining Useful Life Estimation
  • Conference Paper

May 2021 · 67 Reads · 22 Citations

Jou Won Song · Ye In Park · Jong-Ju Hong · [...]


Figure 2. Overall network structure of the proposed method. The network consists of two sub-networks: the attention network and ADGAN.
Figure 3. Our hard augmentation-based anomaly generation. Two types of synthetic anomaly data x_Ano are generated by several hard augmentations.
Figure 4. Qualitative results on the CIFAR10 dataset. (a) Normal input images and attention maps. (b), (c) Anomaly input images and attention maps. The proposed method removes the region of the anomaly class, so the attention map generated for an anomaly input differs from the attention map of a normal image.
Figure 7. Output images reconstructed from the attention network and the attention map. The normal class of CIFAR10 is airplane.
Attention Map-guided Two-stage Anomaly Detection using Hard Augmentation
  • Preprint
  • File available

March 2021 · 106 Reads · 1 Citation

Anomaly detection is a task that recognizes whether an input sample is included in the distribution of a target normal class or an anomaly class. Conventional generative adversarial network (GAN)-based methods utilize an entire image including foreground and background as an input. However, in these methods, a useless region unrelated to the normal class (e.g., unrelated background) is learned as normal class distribution, thereby leading to false detection. To alleviate this problem, this paper proposes a novel two-stage network consisting of an attention network and an anomaly detection GAN (ADGAN). The attention network generates an attention map that can indicate the region representing the normal class distribution. To generate an accurate attention map, we propose the attention loss and the adversarial anomaly loss based on synthetic anomaly samples generated from hard augmentation. By applying the attention map to an image feature map, ADGAN learns the normal class distribution from which the useless region is removed, and it is possible to greatly reduce the problem difficulty of the anomaly detection task. Additionally, the estimated attention map can be used for anomaly segmentation because it can distinguish between normal and anomaly regions. As a result, the proposed method outperforms the state-of-the-art anomaly detection and anomaly segmentation methods for widely used datasets.
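The abstract describes two mechanisms: generating synthetic anomalies via hard augmentations, and weighting an image feature map with an attention map so that regions outside the normal-class distribution are suppressed. The sketch below illustrates both ideas in NumPy only; it is not the authors' implementation, and the function names, the patch-rotation augmentation, and the toy attention map are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_augment(image: np.ndarray) -> np.ndarray:
    """Create a synthetic anomaly by rotating a random patch in place,
    a CutPaste-style hard augmentation (patch size is an illustrative choice)."""
    h, w = image.shape[:2]
    ph, pw = h // 4, w // 4
    y = rng.integers(0, h - ph)
    x = rng.integers(0, w - pw)
    out = image.copy()
    patch = out[y:y + ph, x:x + pw].copy()
    out[y:y + ph, x:x + pw] = np.rot90(patch, 2)  # 180-degree rotation
    return out

def apply_attention(feature_map: np.ndarray, attention_map: np.ndarray) -> np.ndarray:
    """Element-wise weighting of an (H, W, C) feature map by an (H, W)
    attention map, zeroing out regions unrelated to the normal class."""
    return feature_map * attention_map[..., None]

# Toy usage: a random "image", a synthetic anomaly, and a masked feature map.
img = rng.random((32, 32, 3))
anomaly = hard_augment(img)
attn = np.ones((32, 32))
attn[:8, :] = 0.0  # pretend the top rows are irrelevant background
feat = apply_attention(img, attn)
```

In the paper the attention map is learned (via the proposed attention loss and adversarial anomaly loss) rather than fixed as here; the sketch only shows how such a map gates a feature map before it reaches the detection GAN.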


Citations (2)


... Muneer et al. [26] proposed a new attention-based DCNN architecture for RUL prediction. Song et al. [27] proposed a novel RUL estimation method based on an attention mechanism to capture long-term dependencies in sequence data. Zhang et al. [28] proposed a method called dual-aspect self-attention based on transformer (DAST), which offers enhanced processing of long data sequences by using an encoder-decoder structure. ...
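The excerpt above summarizes attention mechanisms used to capture long-term dependencies for RUL estimation. A minimal NumPy sketch of the core idea, attention pooling over a sequence of hidden states (e.g., BiLSTM outputs), is shown below; this is a generic illustration under assumed shapes, not the model from any of the cited papers.

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_pool(H: np.ndarray, w: np.ndarray):
    """Score each timestep, normalize the scores into attention weights,
    and return the attention-weighted sum of hidden states.

    H: (T, D) hidden states for T timesteps; w: (D,) learnable score vector."""
    scores = H @ w            # (T,) one relevance score per timestep
    alpha = softmax(scores)   # (T,) attention weights, sum to 1
    context = alpha @ H       # (D,) weighted combination of all timesteps
    return context, alpha

# Toy usage: 30 timesteps of 16-dimensional (e.g., BiLSTM) features.
rng = np.random.default_rng(1)
H = rng.random((30, 16))
w = rng.random(16)
context, alpha = attention_pool(H, w)
```

The context vector would typically feed a small regression head that outputs the RUL estimate; because every timestep contributes in proportion to its weight, distant-past steps can influence the prediction directly, which is the long-term-dependency benefit the excerpt refers to.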

Reference:

A new dual-channel transformer-based network for remaining useful life prediction
Attention-based Bidirectional LSTM-CNN Model for Remaining Useful Life Estimation
  • Citing Conference Paper
  • May 2021

... Among these GAN-based anomaly detection methods, the ways in which the output of the discriminator is used to distinguish the two types of data can also be roughly classified into two categories. One is used by models such as [31], which directly use the discriminator to classify the input data, with a label as the final output. The other is used by models such as [2,35], which propose the concept of an anomaly score. ...

Attention Map-guided Two-stage Anomaly Detection using Hard Augmentation