In this paper, we study abstractive sentence summarization. Two essential sources of information can influence the quality of news summarization: topic keywords and the knowledge structure of the news text. Moreover, existing knowledge encoders perform poorly on the sparse knowledge structures of single sentences. Motivated by these observations, we propose KAS, a novel Knowledge and Keywords Augmented Abstractive Sentence Summarization framework. Tri-encoders are used to integrate the contexts of the original text, the knowledge structure, and the topic keywords simultaneously, with a specially linearized knowledge structure. Automatic and human evaluations demonstrate that KAS achieves the best performance.
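To make the tri-encoder design concrete, the following is a minimal PyTorch sketch, not the authors' released code: three separate transformer encoders encode the source sentence, the linearized knowledge triples, and the topic keywords, and their outputs are concatenated into a single memory that a summarization decoder can attend over. The class name TriEncoder, all dimensions, and the concatenation-based fusion are illustrative assumptions.

```python
import torch
import torch.nn as nn


class TriEncoder(nn.Module):
    """Hypothetical sketch of a tri-encoder: one transformer encoder per
    input view (source text, linearized knowledge triples, topic keywords).
    All sizes and the concatenation-based fusion are illustrative
    assumptions, not the authors' released implementation."""

    def __init__(self, vocab_size=32000, d_model=512, nhead=8, num_layers=4):
        super().__init__()
        # A shared embedding keeps the sketch small; the paper may use
        # separate embeddings per view.
        self.embed = nn.Embedding(vocab_size, d_model)

        def make_encoder():
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers)

        self.text_enc = make_encoder()  # original sentence
        self.kg_enc = make_encoder()    # knowledge triples, linearized to tokens
        self.kw_enc = make_encoder()    # topic keywords

    def forward(self, text_ids, kg_ids, kw_ids):
        # Encode each view independently with its own encoder stack.
        h_text = self.text_enc(self.embed(text_ids))
        h_kg = self.kg_enc(self.embed(kg_ids))
        h_kw = self.kw_enc(self.embed(kw_ids))
        # Concatenate along the sequence axis so a decoder can cross-attend
        # over all three context memories at once.
        return torch.cat([h_text, h_kg, h_kw], dim=1)


# Shape check with toy inputs: batch of 2, different lengths per view.
enc = TriEncoder()
memory = enc(
    torch.randint(0, 32000, (2, 30)),  # source sentence tokens
    torch.randint(0, 32000, (2, 12)),  # linearized triples, e.g. "subj rel obj"
    torch.randint(0, 32000, (2, 5)),   # topic keywords
)
print(memory.shape)  # torch.Size([2, 47, 512])
```

One plausible reading of the linearized knowledge structure is that each triple is flattened into a plain token sequence (e.g., "subject relation object"), which lets a standard sequence encoder consume the sparse sentence-level structure without a dedicated graph encoder.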