AMR (Abstract Meaning Representation) and EDS (Elementary Dependency Structures) are two popular meaning representations in NLP/NLU. AMR is more abstract and conceptual, while EDS is lower-level, closer to the lexical structure of the given sentences. It is thus not surprising that EDS parsing is easier than AMR parsing. In this work, we consider using information from EDS parsing to help improve the performance of AMR parsing. We adopt a transition-based parser and propose to add EDS graphs as additional semantic features, encoded by a graph encoder composed of an LSTM layer and a GCN layer. Our experimental results show that the additional information from EDS parsing indeed boosts the performance of the base AMR parser used in our experiments.
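The abstract does not spell out the encoder's implementation, but a minimal sketch of the kind of architecture it names (an LSTM layer followed by a GCN layer over the EDS graph, producing per-node features for the parser) might look like the following PyTorch code. All class names, dimensions, and the degree-normalized GCN variant are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn as nn

class EDSGraphEncoder(nn.Module):
    # Hypothetical sketch: a BiLSTM reads EDS node labels in their
    # linear (surface) order, then one GCN layer propagates the
    # resulting states along EDS edges.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)
        # One GCN layer: a linear transform applied after summing
        # each node's neighbor states.
        self.gcn_weight = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, node_ids, adj):
        # node_ids: (batch, num_nodes) label indices for EDS nodes
        # adj:      (batch, num_nodes, num_nodes) adjacency matrix,
        #           assumed to include self-loops
        h, _ = self.lstm(self.embed(node_ids))        # contextual node states
        deg = adj.sum(-1, keepdim=True).clamp(min=1)  # degree normalization
        h = torch.relu(self.gcn_weight(adj @ h) / deg)  # one round of message passing
        # Per-node EDS features, e.g. to be concatenated with the
        # transition-based AMR parser's stack/buffer representations.
        return h

How the node features are injected into the transition-based parser (concatenation with stack/buffer states, attention, etc.) is not specified in the abstract; the comment above is one plausible choice.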