There has been significant progress in the field of Extractive Question Answering (EQA) in recent years. However, most approaches rely on annotations of answer spans in the corresponding passages. In this work, we address the problem of EQA when no annotations are present for the answer span, i.e., when the dataset contains only questions and the corresponding passages. Our method is based on auto-encoding of the question: it performs a question answering task during encoding and a question generation task during decoding. We show that our method performs well in a zero-shot setting and can provide an additional loss to boost performance for EQA.
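The following is a minimal sketch of the question auto-encoding idea described above, written in PyTorch. All module and parameter names here are illustrative assumptions, not the paper's actual architecture: an extractive QA "encoder" predicts a soft answer span in the passage, and a question generation "decoder" reconstructs the question from that span, so the question reconstruction loss can supervise span prediction without answer annotations.

```python
# Hypothetical sketch of question auto-encoding for unsupervised EQA.
# Architecture and names are assumptions for illustration only.
import torch
import torch.nn as nn

class QuestionAutoEncoder(nn.Module):
    def __init__(self, vocab_size=30522, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # "Encoding" step = extractive QA head: scores passage tokens as answer start/end.
        self.qa_encoder = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.span_head = nn.Linear(2 * hidden, 2)  # start / end logits
        # "Decoding" step = question generation conditioned on the predicted span.
        self.qg_decoder = nn.LSTM(hidden, 2 * hidden, batch_first=True)
        self.qg_out = nn.Linear(2 * hidden, vocab_size)

    def forward(self, passage_ids, question_ids):
        # Predict an answer span over the passage (question answering during encoding).
        p = self.embed(passage_ids)
        enc, _ = self.qa_encoder(p)
        start_logits, end_logits = self.span_head(enc).split(1, dim=-1)
        start_probs = start_logits.squeeze(-1).softmax(-1)
        end_probs = end_logits.squeeze(-1).softmax(-1)
        # Soft, differentiable span representation in place of a hard argmax span.
        span_repr = ((start_probs + end_probs).unsqueeze(-1) * enc).sum(1)
        # Regenerate the question from the span (question generation during decoding).
        q = self.embed(question_ids[:, :-1])
        h0 = span_repr.unsqueeze(0)
        c0 = torch.zeros_like(h0)
        dec, _ = self.qg_decoder(q, (h0, c0))
        logits = self.qg_out(dec)
        # Question reconstruction loss: the only supervision signal; no answer labels needed.
        return nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            question_ids[:, 1:].reshape(-1),
        )
```

In this reading, the reconstruction loss can be used alone (the zero-shot setting) or added as an auxiliary term to a standard supervised EQA loss to boost performance.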