Conditioned dialogue generation suffers from the scarcity of labeled responses. In this work, we exploit labeled non-dialogue text data related to the condition, which are much easier to collect. We propose a multi-task learning approach to leverage both labeled dialogue and text data. The three tasks jointly optimize the same pre-trained Transformer: a conditioned dialogue generation task on the labeled dialogue data, and a conditioned language encoding task and a conditioned language generation task on the labeled text data. Experimental results show that our approach outperforms state-of-the-art models by leveraging the labeled texts, and it also obtains a larger performance improvement compared with previous methods for leveraging text data.
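The joint optimization described above can be sketched as a weighted sum of the three per-task losses backpropagated through one shared model. The function below is a minimal illustration only; the task names and the equal default weights are assumptions, not details from the paper.

```python
# Minimal sketch of multi-task loss aggregation for a shared model.
# The three loss values would come from: (1) conditioned dialogue
# generation on labeled dialogue data, (2) conditioned language
# encoding and (3) conditioned language generation on labeled text
# data. Weights are illustrative, not taken from the paper.
def combine_losses(dialogue_gen_loss, lang_encoding_loss,
                   lang_gen_loss, weights=(1.0, 1.0, 1.0)):
    """Return the weighted sum of the three task losses."""
    w_d, w_e, w_g = weights
    return (w_d * dialogue_gen_loss
            + w_e * lang_encoding_loss
            + w_g * lang_gen_loss)
```

In practice, each training step would evaluate whichever task losses the current batch supports and call an optimizer on the combined value, so all three tasks update the same pre-trained Transformer parameters.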