Angela Xu’s scientific contributions

What is this page?


This page lists works by an author who doesn't have a ResearchGate profile or hasn't yet added these works to their profile. It is automatically generated from publicly available (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (1)


Figure 1: Baseline sequence-to-sequence model architecture with attention [See et al., 2017]
Figure 2: Pointer-generator model architecture [See et al., 2017]
Figure 3: (a) The Transformer model architecture; (b) (left) Scaled Dot-Product Attention, (right) Multi-Head Attention, which consists of several attention layers running in parallel [Vaswani et al., 2017]
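
For reference, the scaled dot-product and multi-head attention shown in Figure 3(b) are defined in Vaswani et al. [2017] (with Q, K, V the query, key, and value matrices and d_k the key dimension) as:

    \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V

    \mathrm{MultiHead}(Q, K, V) = \mathrm{Concat}(\mathrm{head}_1, \ldots, \mathrm{head}_h) W^O,
    \quad \mathrm{head}_i = \mathrm{Attention}(Q W_i^Q, K W_i^K, V W_i^V)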
Neural Abstractive Text Summarization and Fake News Detection
  • Preprint
  • File available

March 2019 · 2,136 Reads · Gao Xian Peh · Angela Xu

In this work, we study abstractive text summarization by exploring different models: an LSTM encoder-decoder with attention, pointer-generator networks, coverage mechanisms, and transformers. After extensive and careful hyperparameter tuning, we compare the proposed architectures against each other on the abstractive text summarization task. Finally, as an extension of our work, we apply our text summarization model as a feature extractor for a fake news detection task: news articles are summarized prior to classification, and the results are compared against classification using only the original news text.

Keywords: abstractive text summarization, pointer-generator, coverage mechanism, transformers, fake news detection
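
For background on the mechanisms named in the abstract, the pointer-generator of See et al. [2017] (the source of Figures 1 and 2) computes, at each decoder step t, a generation probability p_gen from the context vector h_t^*, the decoder state s_t, and the decoder input x_t, and uses it to interpolate between generating from the vocabulary and copying source tokens via the attention distribution a^t:

    p_{\mathrm{gen}} = \sigma\!\left(w_{h^*}^{\top} h_t^* + w_s^{\top} s_t + w_x^{\top} x_t + b_{\mathrm{ptr}}\right)

    P(w) = p_{\mathrm{gen}} \, P_{\mathrm{vocab}}(w) + (1 - p_{\mathrm{gen}}) \sum_{i : w_i = w} a_i^t

The coverage mechanism accumulates past attention, c^t = \sum_{t'=0}^{t-1} a^{t'}, and penalizes re-attending to the same positions with the loss \mathrm{covloss}_t = \sum_i \min(a_i^t, c_i^t), which discourages repetition in the generated summary.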

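A minimal sketch of the summarize-then-classify comparison described in the abstract, assuming a generic TF-IDF plus logistic-regression classifier (the paper's actual models and data pipeline are not specified here); summarize is a hypothetical stand-in for the trained abstractive summarizer:

    # Sketch: compare fake-news classification on full articles vs. on
    # model-generated summaries used as extracted features.
    # `summarize` is a hypothetical placeholder, not the authors' model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def summarize(article: str) -> str:
        """Placeholder for the trained abstractive summarizer."""
        return " ".join(article.split()[:50])  # naive truncation stand-in

    def evaluate(train_texts, train_labels, test_texts, test_labels):
        """Fit TF-IDF + logistic regression and return test accuracy."""
        clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        clf.fit(train_texts, train_labels)
        return clf.score(test_texts, test_labels)

    # Placeholder corpus; 1 = fake, 0 = real.
    train_texts = ["article one ...", "article two ...",
                   "article three ...", "article four ..."]
    train_labels = [0, 1, 0, 1]
    test_texts = ["article five ...", "article six ..."]
    test_labels = [1, 0]

    acc_full = evaluate(train_texts, train_labels, test_texts, test_labels)
    acc_summ = evaluate([summarize(t) for t in train_texts], train_labels,
                        [summarize(t) for t in test_texts], test_labels)
    print(f"accuracy on full text: {acc_full:.2f}, on summaries: {acc_summ:.2f}")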