Topic-Guided Abstractive Text Summarization: a Joint Learning Approach


Abstract

We introduce a new approach for abstractive text summarization, Topic-Guided Abstractive Summarization, which calibrates long-range dependencies from topic-level features with globally salient content. The idea is to incorporate neural topic modeling with a Transformer-based sequence-to-sequence (seq2seq) model in a joint learning framework. This design learns and preserves the global semantics of the document, which provides additional contextual guidance for capturing its important ideas and thereby enhances summary generation. We conduct extensive experiments on two datasets, and the results show that our proposed model outperforms many extractive and abstractive systems in terms of both ROUGE scores and human evaluation. Our code is available at: https://github.com/chz816/tas.
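To make the joint learning idea concrete, the sketch below is a minimal, hypothetical illustration (not the released implementation in the repository above): it pairs a VAE-style neural topic model over bag-of-words features with a Transformer-based seq2seq model and optimizes both under a single combined loss. All class names, dimensions, and the particular way the topic vector is injected into the encoder input are assumptions made for illustration only.

```python
# Minimal sketch of joint topic-model + seq2seq training (assumed design, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralTopicModel(nn.Module):
    """VAE-style topic model over bag-of-words document vectors."""
    def __init__(self, vocab_size: int, num_topics: int = 50, hidden: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        self.decoder = nn.Linear(num_topics, vocab_size)  # topic-word weights

    def forward(self, bow):
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        theta = torch.softmax(z, dim=-1)                          # document-topic mixture
        recon = torch.log_softmax(self.decoder(theta), dim=-1)
        recon_loss = -(bow * recon).sum(-1).mean()                # BoW reconstruction term
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return theta, recon_loss + kl

class TopicGuidedSummarizer(nn.Module):
    """Transformer seq2seq whose encoder input is augmented with the topic vector.
    Positional encodings and target shifting are omitted for brevity."""
    def __init__(self, vocab_size: int, d_model: int = 512, num_topics: int = 50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.seq2seq = nn.Transformer(d_model=d_model, batch_first=True)
        self.topic_proj = nn.Linear(num_topics, d_model)  # project topic mixture into model space
        self.lm_head = nn.Linear(d_model, vocab_size)
        self.ntm = NeuralTopicModel(vocab_size, num_topics)

    def forward(self, src_ids, tgt_ids, src_bow):
        theta, ntm_loss = self.ntm(src_bow)
        # Add the document-level topic representation to every source position.
        src = self.embed(src_ids) + self.topic_proj(theta).unsqueeze(1)
        tgt = self.embed(tgt_ids)
        dec = self.seq2seq(src, tgt)
        return self.lm_head(dec), ntm_loss

def joint_loss(logits, tgt_labels, ntm_loss, topic_weight: float = 1.0):
    """Single objective: summary cross-entropy plus weighted topic-model ELBO."""
    ce = F.cross_entropy(logits.reshape(-1, logits.size(-1)), tgt_labels.reshape(-1))
    return ce + topic_weight * ntm_loss

if __name__ == "__main__":
    vocab = 1000
    model = TopicGuidedSummarizer(vocab)
    src = torch.randint(0, vocab, (2, 30))   # toy source token ids
    tgt = torch.randint(0, vocab, (2, 10))   # toy summary token ids
    bow = torch.rand(2, vocab)               # toy bag-of-words features
    logits, ntm_loss = model(src, tgt, bow)
    loss = joint_loss(logits, tgt, ntm_loss)
    loss.backward()                          # both components trained end to end
    print(float(loss))
```

In this sketch the two objectives share gradients through the topic vector, which is the sense in which the topic model and the summarizer are trained jointly; the weighting of the topic-model term is a free hyperparameter introduced here for illustration.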
