Static knowledge graph (SKG) embedding (SKGE) has been studied intensively in the past years. Recently, temporal knowledge graph (TKG) embedding (TKGE) has emerged. In this paper, we propose a Recursive Temporal Fact Embedding (RTFE) framework to transplant SKGE models to TKGs and to enhance the performance of existing TKGE models for TKG completion. Different from previous work, which ignores the continuity of TKG states in time evolution, we treat the sequence of graphs as a Markov chain that transitions from the previous state to the next. RTFE uses an SKGE model to initialize the embeddings of the TKG. It then recursively tracks the state transition of the TKG by passing updated parameters/features between timestamps. Specifically, at each timestamp, we approximate the state transition as a gradient update process. Since RTFE learns each timestamp recursively, it can naturally transition to future timestamps. Experiments on five TKG datasets show the effectiveness of RTFE.
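The recursive idea in the abstract can be illustrated with a minimal sketch. The names, the TransE-style score, and all hyperparameters below are assumptions for illustration, not the paper's exact formulation: embeddings for timestamp t are initialized from those of timestamp t-1 and refined by a few gradient steps on the facts observed at t.

```python
# Minimal sketch of recursive temporal fact embedding (illustrative only).
# Assumptions: a TransE-style squared distance ||e_h + r - e_t||^2 as the
# score, plain SGD as the "state transition", random init standing in for
# a pre-trained SKGE model.
import numpy as np

def rtfe_sketch(num_entities, num_relations, snapshots,
                dim=8, lr=0.05, steps=20, seed=0):
    """Carry embeddings across timestamps, refining them at each one.

    snapshots: one list of (head, relation, tail) facts per timestamp.
    Returns the (entity, relation) embedding state after each timestamp.
    """
    rng = np.random.default_rng(seed)
    E = rng.normal(scale=0.1, size=(num_entities, dim))   # stand-in for SKGE init
    R = rng.normal(scale=0.1, size=(num_relations, dim))
    states = []
    for facts in snapshots:              # one snapshot per timestamp
        for _ in range(steps):           # approximate the state transition
            for h, r, t in facts:        # as a gradient update process
                diff = E[h] + R[r] - E[t]   # gradient of 0.5*||e_h + r - e_t||^2
                E[h] -= lr * diff
                R[r] -= lr * diff
                E[t] += lr * diff
        states.append((E.copy(), R.copy()))  # state passed to the next timestamp
    return states
```

Because each timestamp starts from the previous state rather than from scratch, the same loop extends naturally to unseen future timestamps, mirroring the recursion described in the abstract.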