Fine-grained temporal relation extraction (FineTempRel) aims to recognize the durations and timeline of event mentions in text. A shortcoming of current deep learning models for FineTempRel is their failure to exploit the syntactic structures of the input sentences to enrich the representation vectors. In this work, we propose to fill this gap by introducing novel methods to integrate syntactic structures into deep learning models for FineTempRel. The proposed model focuses on two types of syntactic information from the dependency trees, i.e., syntax-based importance scores for representation learning of the words and syntactic connections to identify important context words for the event mentions. We also present novel techniques to facilitate knowledge transfer between the subtasks of FineTempRel, leading to a novel model with state-of-the-art performance for this task.
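To make the two dependency-tree ideas named above concrete, here is a minimal sketch, not the authors' implementation: it assumes the tree is given as head indices, the event mention is a single token index, and contextual word vectors come from an external encoder. The scoring function (inverse tree distance) and the helper names are illustrative assumptions.

```python
# Hedged sketch of (1) syntax-based importance scores and (2) syntactic
# context-word selection from a dependency tree. All names and the exact
# scoring scheme are assumptions for illustration, not the paper's method.
import numpy as np
import networkx as nx

def syntax_importance_scores(heads, event_idx):
    """Score each word by its dependency-tree distance to the event mention;
    closer words receive higher (assumed) importance."""
    g = nx.Graph()
    g.add_nodes_from(range(len(heads)))
    for i, h in enumerate(heads):
        if h >= 0:                       # -1 marks the root
            g.add_edge(i, h)
    dist = nx.single_source_shortest_path_length(g, event_idx)
    scores = np.array([1.0 / (1 + dist.get(i, len(heads)))
                       for i in range(len(heads))])
    return scores / scores.sum()         # normalize to a distribution

def syntactic_context_words(heads, event_idx, hops=1):
    """Return context words within `hops` steps of the event mention
    in the dependency tree (its direct syntactic connections)."""
    g = nx.Graph((i, h) for i, h in enumerate(heads) if h >= 0)
    dist = nx.single_source_shortest_path_length(g, event_idx, cutoff=hops)
    return [i for i in dist if i != event_idx]

# Toy example: "She visited Paris last week"; heads chosen for illustration.
heads = [1, -1, 1, 4, 1]                 # word -> head index, root = -1
scores = syntax_importance_scores(heads, event_idx=1)
vectors = np.random.randn(5, 768)        # stand-in for contextual embeddings
event_repr = scores @ vectors            # syntax-weighted representation
print(scores, syntactic_context_words(heads, event_idx=1))
```

In this sketch the importance scores act as attention-style weights over the word vectors, and the hop-limited neighborhood plays the role of the "important context words"; the real model would learn these components jointly rather than fix them by hand.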