Syntactic parsing using dependency structures has become a standard technique in natural language processing, with many different parsing models, in particular data-driven models that can be trained on syntactically annotated corpora. In this paper, we tackle transition-based dependency parsing using a Perceptron Learner. Our proposed model, which adds more relevant features to the Perceptron Learner, outperforms a baseline arc-standard parser, and it achieves a higher unlabeled attachment score (UAS) than the MALT and LSTM parsers. We also discuss possible ways to address the parsing of non-projective trees.
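The abstract does not spell out the transition system or the feature templates, so the following is a minimal sketch, assuming a plain arc-standard system (SHIFT, LEFT-ARC, RIGHT-ARC) scored by a perceptron; the word-form feature templates are illustrative placeholders for the paper's richer feature set.

```python
from collections import defaultdict

SHIFT, LEFT_ARC, RIGHT_ARC = "SH", "LA", "RA"

def features(stack, buffer, words):
    """Illustrative templates: top-of-stack and front-of-buffer word forms."""
    s0 = words[stack[-1]] if stack else "<empty>"
    b0 = words[buffer[0]] if buffer else "<empty>"
    return [f"s0={s0}", f"b0={b0}", f"s0|b0={s0}|{b0}"]

class PerceptronParser:
    def __init__(self):
        # One sparse weight vector per transition.
        self.weights = {a: defaultdict(float) for a in (SHIFT, LEFT_ARC, RIGHT_ARC)}

    def score(self, feats, action):
        return sum(self.weights[action][f] for f in feats)

    def legal(self, stack, buffer):
        actions = [SHIFT] if buffer else []
        if len(stack) >= 2:
            actions += [LEFT_ARC, RIGHT_ARC]
        return actions

    def update(self, feats, gold, pred):
        # Standard perceptron update: reward the gold action, penalize the prediction.
        if gold != pred:
            for f in feats:
                self.weights[gold][f] += 1.0
                self.weights[pred][f] -= 1.0

    def parse(self, words):
        stack, buffer, arcs = [], list(range(len(words))), []
        while buffer or len(stack) > 1:
            feats = features(stack, buffer, words)
            action = max(self.legal(stack, buffer), key=lambda a: self.score(feats, a))
            if action == SHIFT:
                stack.append(buffer.pop(0))
            elif action == LEFT_ARC:          # attach s1 as dependent of s0
                dependent = stack.pop(-2)
                arcs.append((stack[-1], dependent))
            else:                             # attach s0 as dependent of s1
                dependent = stack.pop()
                arcs.append((stack[-1], dependent))
        return arcs
```

In training, one would run this greedy parser against an oracle transition sequence and call `update` on each mistake, typically with weight averaging; decoding terminates with the single remaining stack item as the root.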
In this paper, we present an approach to improving the accuracy of a strong transition-based dependency parser by exploiting dependency language models extracted from a large parsed corpus. We integrate a small number of features based on these dependency language models into the parser.
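The exact templates are not given in this truncated abstract; as a hedged illustration, the sketch below estimates a first-order dependency language model P(dependent | head) from an auto-parsed corpus and buckets its log-probability into a coarse discrete feature, a common way to inject such scores into a feature-based parser. The class and function names, the smoothing, and the bucket edges are all assumptions.

```python
from collections import Counter
import math

class DependencyLM:
    """First-order dependency LM: counts of (head word, dependent word) pairs."""
    def __init__(self):
        self.pair = Counter()
        self.head = Counter()

    def add_tree(self, words, heads):
        """heads[i] is the index of word i's head, or -1 for the root."""
        for i, h in enumerate(heads):
            if h >= 0:
                self.pair[(words[h], words[i])] += 1
                self.head[words[h]] += 1

    def logprob(self, head, dep):
        # Crude add-one smoothing; a real system would smooth more carefully.
        num = self.pair[(head, dep)] + 1
        den = self.head[head] + len(self.pair) + 1
        return math.log(num / den)

def dlm_feature(lm, head, dep, edges=(-10.0, -8.0, -6.0, -4.0)):
    """Map the log-probability to one of a few buckets, usable as a discrete feature."""
    bucket = sum(lm.logprob(head, dep) > e for e in edges)
    return f"dlm_bucket={bucket}"
```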
In recent years, dependency parsing has been a fascinating research topic with many applications in natural language processing. In this paper, we present an effective approach to improving dependency parsing by utilizing supertag features.
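The truncated abstract does not show how the supertags enter the model; a minimal sketch, assuming one predicted supertag per token that is conjoined with stack and buffer positions in the feature templates (all names here are illustrative), could look like this:

```python
def supertag_features(stack, buffer, supertags):
    """Conjoin the predicted supertags of the stack top and buffer front,
    mirroring ordinary word-form and POS feature templates."""
    s0 = supertags[stack[-1]] if stack else "<none>"
    b0 = supertags[buffer[0]] if buffer else "<none>"
    return [f"s0_stag={s0}", f"b0_stag={b0}", f"s0_stag|b0_stag={s0}|{b0}"]
```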
We propose a headed span-based method for projective dependency parsing. In a projective tree, the subtree rooted at each word occupies a contiguous sequence (i.e., a span) of the surface order; we call this span-headword pair a headed span.
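To make the definition concrete, the sketch below (an illustration of the definition, not the paper's scoring model) recovers the headed span of every word from a projective head array: since each subtree is contiguous, word i pairs with the boundaries of the positions its subtree covers.

```python
def headed_spans(heads):
    """heads[i] = index of word i's head, or -1 for the root (0-indexed).
    Returns (headword, left, right) triples: word i heads the span left..right."""
    n = len(heads)
    left, right = list(range(n)), list(range(n))
    for i in range(n):
        j = heads[i]
        while j >= 0:                  # every ancestor's span must cover position i
            left[j] = min(left[j], i)
            right[j] = max(right[j], i)
            j = heads[j]
    return [(i, left[i], right[i]) for i in range(n)]

# "the cat sat": the -> cat, cat -> sat, sat = root
print(headed_spans([1, 2, -1]))        # [(0, 0, 0), (1, 0, 1), (2, 0, 2)]
```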
Chinese word segmentation and dependency parsing are two fundamental tasks in Chinese natural language processing. Dependency parsing is defined at the word level, so word segmentation is a precondition for dependency parsing.
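As a small illustration of this pipeline dependency (the function names are hypothetical stand-ins, not the paper's components): word-level parsing can only run on the output of a segmenter, so any segmentation error changes the very units the parser sees.

```python
def pipeline(chars, segment, parse_words):
    """Hypothetical two-stage pipeline: segment characters into words,
    then parse at the word level. Head indices refer to the word sequence,
    so they are only meaningful relative to the chosen segmentation."""
    words = segment(chars)        # e.g. "我喜欢猫" -> ["我", "喜欢", "猫"]
    return words, parse_words(words)
```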
In this paper, we study the problem of parsing structured knowledge graphs from textual descriptions. In particular, we consider the scene graph representation, which captures objects together with their attributes and relations.
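As a concrete illustration of that representation (the class names are assumptions for this sketch, not the paper's API), a scene graph can be modeled as objects carrying attribute lists plus subject-predicate-object relations:

```python
from dataclasses import dataclass, field

@dataclass
class SceneObject:
    name: str
    attributes: list = field(default_factory=list)

@dataclass
class Relation:
    subject: SceneObject
    predicate: str
    obj: SceneObject

# "a young girl standing on a red surfboard"
girl = SceneObject("girl", ["young"])
board = SceneObject("surfboard", ["red"])
graph = {"objects": [girl, board],
         "relations": [Relation(girl, "standing on", board)]}
```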