Text Simplification improves the readability of sentences through several rewriting transformations, such as lexical paraphrasing, deletion, and splitting. Current simplification systems are predominantly sequence-to-sequence models that are trained end-to-end to perform all these operations simultaneously. However, such systems limit themselves to mostly deleting words and cannot easily adapt to the requirements of different target audiences. In this paper, we propose a novel hybrid approach that leverages linguistically-motivated rules for splitting and deletion, and couples them with a neural paraphrasing model to produce varied rewriting styles. We introduce a new data augmentation method to improve the paraphrasing capability of our model. Through automatic and manual evaluations, we show that our proposed model establishes a new state-of-the-art for the task, paraphrasing more often than the existing systems, and can control the degree of each simplification operation applied to the input texts.