Within the context of event modeling and understanding, we propose a new method for neural sequence modeling that takes partially-observed sequences of discrete, external knowledge into account. We construct a sequential neural variational autoencoder, which uses Gumbel-Softmax reparametrization within a carefully defined encoder, to allow for successful backpropagation during training. The core idea is to allow semi-supervised external discrete knowledge to guide, but not restrict, the variational latent parameters during training. Our experiments indicate that our approach not only outperforms multiple baselines and the state-of-the-art in narrative script induction, but also converges more quickly.
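The abstract's key mechanism, Gumbel-Softmax reparametrization, makes sampling from a categorical latent variable differentiable so gradients can flow through the encoder during backpropagation. The paper's actual model is not reproduced here; the following is a minimal, self-contained sketch of the reparametrization itself in NumPy (the function name `gumbel_softmax` and the use of plain NumPy rather than a deep-learning framework are illustrative choices, not the authors' implementation):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Draw a differentiable, approximately one-hot sample from a
    categorical distribution via the Gumbel-Softmax reparametrization.

    logits: unnormalized log-probabilities of the categories.
    tau:    temperature; lower values push the sample closer to one-hot.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    # Perturb the logits with Gumbel noise, then apply a tempered softmax.
    z = (np.asarray(logits) + gumbel) / tau
    e = np.exp(z - z.max())  # numerically stable softmax
    return e / e.sum()

# Example: a relaxed sample over three categories with probabilities
# proportional to (0.7, 0.2, 0.1).
sample = gumbel_softmax(np.log([0.7, 0.2, 0.1]), tau=0.5,
                        rng=np.random.default_rng(0))
```

Because the sample is a deterministic, differentiable function of the logits given the noise, gradients with respect to the logits are well defined, which is what allows the encoder described in the abstract to be trained end-to-end.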