We present a system for learning generalized, stereotypical patterns of events---or "schemas"---from natural language stories, and applying them to make predictions about other stories. Our schemas are represented with Episodic Logic, a logical form that closely mirrors natural language. By beginning with a "head start" set of protoschemas---schemas that a 1- or 2-year-old child would likely know---we can obtain useful, general world knowledge with very few story examples---often only one or two. Learned schemas can be combined into more complex, composite schemas, and used to make predictions in other stories where only partial information is available.