Abstract Meaning Representation (AMR) is a graphical meaning representation language designed to represent propositional information about argument structure. However, at present it is unable to satisfyingly represent non-veridical intensional contexts, often licensing inappropriate inferences. In this paper, we show how to resolve the problem of non-veridicality without appealing to layered graphs through a mapping from AMRs into Simply-Typed Lambda Calculus (STLC). At least for some cases, this requires the introduction of a new role :content which functions as an intensional operator. The translation proposed is inspired by the formal linguistics literature on the event semantics of attitude reports. Next, we address the interaction of quantifier scope and intensional operators in so-called de re/de dicto ambiguities. We adopt a scope node from the literature and provide an explicit multidimensional semantics utilizing Cooper storage which allows us to derive the de re and de dicto scope readings as well as intermediate scope readings which prove difficult for accounts without a scope node.
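To make the role of :content concrete, here is a minimal, purely illustrative sketch (not the paper's actual translation procedure): a toy AMR graph for "the boy believes the girl left" is mapped to an event-semantic formula in which the :content role wraps its argument in an intensional operator (written `^`), so the embedded clause is not asserted at the top level and no veridical inference to "the girl left" is licensed. The function name `amr_to_lf`, the dict encoding of AMR graphs, and the concrete role labels are all assumptions made for this sketch.

```python
# Hedged sketch: translating a toy AMR-like graph with a :content role
# into an STLC-style event-semantic formula (string form).
# The graph encoding and `amr_to_lf` are illustrative, not the paper's API.

def amr_to_lf(node, var="e"):
    """Recursively translate a nested AMR-like dict into a formula string.

    Ordinary roles become binary relations on the event variable.
    The :content role intensionalizes its argument: the embedded event
    is existentially quantified *inside* the operator `^`, so it is not
    asserted in the top-level (veridical) context.
    """
    concept = node["instance"]
    conjuncts = [f"{concept}({var})"]
    for role, value in node.items():
        if role == "instance":
            continue
        if role == ":content":
            inner = amr_to_lf(value, var + "'")
            conjuncts.append(f"content({var}, ^∃{var}'. {inner})")
        elif isinstance(value, dict):
            conjuncts.append(f"{role[1:]}({var}, {value['instance']})")
        else:
            conjuncts.append(f"{role[1:]}({var}, {value})")
    return " ∧ ".join(conjuncts)

# Toy AMR for "the boy believes the girl left"
amr = {
    "instance": "believe-01",
    ":ARG0": {"instance": "boy"},
    ":content": {
        "instance": "leave-11",
        ":ARG0": {"instance": "girl"},
    },
}

print("∃e. " + amr_to_lf(amr))
# The embedded clause appears only under content(e, ^...), so the
# translation does not entail leave-11 at the top level.
```

Translating the same graph with :content treated as an ordinary role would instead conjoin the embedded event at the top level, which is exactly the inappropriate inference the abstract describes.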