Most existing Knowledge-based Question Answering (KBQA) methods first learn to map the given question to a query graph, and then convert the graph to an executable query to find the answer. The query graph is typically expanded progressively from the topic entity based on a sequence prediction model. In this paper, we propose a new solution to query graph generation that works in the opposite manner: we start with the entire knowledge base and gradually shrink it to the desired query graph. This approach improves both the efficiency and the accuracy of query graph generation, especially for complex multi-hop questions. Experimental results show that our method achieves state-of-the-art performance on the ComplexWebQuestions (CWQ) dataset.
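The contrast between the two directions can be sketched on a toy knowledge base. This is only an illustration of the shrinking idea, not the paper's actual algorithm: the entities, relations, and the relevance filter below are all hypothetical stand-ins for a learned scoring model.

```python
# Toy knowledge base as a set of (subject, relation, object) triples.
# All names here are hypothetical examples, not data from the paper.
TOY_KB = {
    ("Obama", "born_in", "Honolulu"),
    ("Honolulu", "located_in", "Hawaii"),
    ("Obama", "spouse", "Michelle"),
    ("Hawaii", "part_of", "USA"),
}


def shrink_to_query_graph(kb, relevant_relations):
    """Start from the whole KB and drop every triple whose relation is
    judged irrelevant to the question; the surviving subgraph is the
    query graph. A real system would use a learned relevance model
    and shrink over several steps rather than one filtering pass."""
    return {t for t in kb if t[1] in relevant_relations}


# For a question like "Where was Obama born?", suppose a (hypothetical)
# scorer keeps only the relation 'born_in'.
query_graph = shrink_to_query_graph(TOY_KB, {"born_in"})
print(query_graph)  # {('Obama', 'born_in', 'Honolulu')}
```

An expansion-based method would instead grow outward from the topic entity (`Obama`) one relation at a time; shrinking operates on the full KB and prunes, which is what the abstract argues helps on multi-hop questions.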