Figure 2 - available via license: Creative Commons Zero 1.0
Source publication
In this work, we study abstractive text summarization by exploring different models such as an LSTM encoder-decoder with attention, pointer-generator networks, coverage mechanisms, and transformers. After extensive and careful hyperparameter tuning, we compare the proposed architectures against each other on the abstractive text summarization task. Fin...
Context in source publication
Context 1
... Mechanism: The pointer-generator is a hybrid network that chooses, during both training and testing, whether to copy words from the source via pointing or to generate words from a fixed vocabulary. Figure 2 shows the architecture of the pointer-generator mechanism, in which the decoder is modified relative to Figure 1. In Figure 1, the baseline model, only an attention distribution and a vocabulary distribution are computed. ...
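As a rough illustration of this mechanism, the sketch below (PyTorch, with tensor names and shapes assumed for illustration rather than taken from the paper) shows how a generation probability p_gen blends the fixed-vocabulary distribution with the attention-based copy distribution over source tokens, in the spirit of the pointer-generator of See et al. (2017).

```python
import torch

def pointer_generator_distribution(p_vocab, attention, p_gen, src_ids, extended_vocab_size):
    """Blend generation and copying into one distribution over an extended vocabulary.

    Assumed shapes (not from the paper):
      p_vocab:   (batch, vocab_size)  softmax over the fixed vocabulary
      attention: (batch, src_len)     attention weights over source tokens
      p_gen:     (batch, 1)           generation probability in [0, 1]
      src_ids:   (batch, src_len)     source token ids mapped into the extended vocabulary

    Implements P_final(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum_{i: x_i = w} a_i.
    """
    batch, vocab_size = p_vocab.shape
    # Pad the vocabulary distribution so out-of-vocabulary source words get their own slots.
    extra = torch.zeros(batch, extended_vocab_size - vocab_size, device=p_vocab.device)
    p_final = torch.cat([p_gen * p_vocab, extra], dim=1)
    # Scatter-add the scaled attention mass onto the source token positions (the copy path).
    p_final = p_final.scatter_add(1, src_ids, (1.0 - p_gen) * attention)
    return p_final  # (batch, extended_vocab_size); each row sums to 1
```

Because the copy distribution is indexed by source token ids, the model can emit out-of-vocabulary words that appear in the source, which is the main practical benefit of the pointer path over the baseline decoder in Figure 1.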
Similar publications
Text summarization is the process of precisely identifying the most important information in a source document; it condenses a longer text into a shorter form. Text summarization is broken down into two approaches: extractive summarization and abstractive summarization. The proposed method creates an extractive summary of a given text and generates an app...