Abstractive conversation summarization has received much attention recently. However, these generated summaries often suffer from insufficient, redundant, or incorrect content, largely due to the unstructured and complex characteristics of human-human interactions. To this end, we propose to explicitly model the rich structures in conversations for more precise and accurate conversation summarization, by first incorporating discourse relations between utterances and action triples ("who-doing-what") in utterances through structured graphs to better encode conversations, and then designing a multi-granularity decoder to generate summaries by combining all levels of information. Experiments show that our proposed models outperform state-of-the-art methods and generalize well in other domains in terms of both automatic evaluations and human judgments. We have publicly released our code at https://github.com/GT-SALT/Structure-Aware-BART.
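The structured inputs described above can be sketched as data the encoder would consume: a graph over utterances whose edges carry discourse relations, with ("who-doing-what") action triples attached to each utterance. This is a minimal illustrative sketch, assuming a simple adjacency-matrix representation; all class and field names are hypothetical and this is not the authors' actual implementation.

```python
# Sketch of a conversation as a discourse graph plus action triples.
# Class/field names are illustrative, not the released code's API.
from dataclasses import dataclass, field

@dataclass
class Utterance:
    speaker: str
    text: str
    # (who, doing, what) action triples extracted from this utterance
    actions: list = field(default_factory=list)

@dataclass
class ConversationGraph:
    utterances: list
    # discourse edges: (source index, target index, relation label)
    edges: list = field(default_factory=list)

    def adjacency(self):
        """Dense adjacency matrix a graph encoder could operate over."""
        n = len(self.utterances)
        adj = [[0] * n for _ in range(n)]
        for src, tgt, _rel in self.edges:
            adj[src][tgt] = 1
        return adj

conv = ConversationGraph(
    utterances=[
        Utterance("Alice", "Can you send the report?",
                  actions=[("Alice", "request", "report")]),
        Utterance("Bob", "Sure, I'll email it tonight.",
                  actions=[("Bob", "email", "report")]),
    ],
    edges=[(0, 1, "Question-Answer")],
)

print(conv.adjacency())  # [[0, 1], [0, 0]]
```

A graph encoder would combine such utterance-level structure with token-level representations, and the multi-granularity decoder would then attend over both when generating the summary.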