Sentence weighting is a simple and powerful domain adaptation technique. We carry out domain classification for computing sentence weights with 1) language model cross entropy difference, 2) a convolutional neural network, and 3) a Recursive Neural Tensor Network. We compare these approaches with regard to domain classification accuracy and study the posterior probability distributions. Then we carry out NMT experiments in the scenario where we have no in-domain parallel corpora and only very limited in-domain monolingual corpora. Here, we use the domain classifier to reweight the sentences of our out-of-domain training corpus. This leads to improvements of up to 2.1 BLEU for German-to-English translation.
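To make the first scoring method concrete, below is a minimal Python sketch of the cross-entropy difference criterion (in the spirit of Moore and Lewis, 2010): a sentence is scored by its cross entropy under an in-domain language model minus its cross entropy under an out-of-domain one, so lower scores indicate more in-domain text. The toy unigram language models, the example corpora, and the sigmoid mapping from score to a (0, 1] training weight are illustrative assumptions, not the paper's exact setup, which may use stronger language models.

```python
import math
from collections import Counter

def train_unigram_lm(sentences):
    """Train an add-one-smoothed unigram LM; returns a log-probability function."""
    counts = Counter(tok for s in sentences for tok in s.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 reserves mass for unseen tokens
    return lambda token: math.log((counts[token] + 1) / (total + vocab))

def cross_entropy(log_prob, sentence):
    """Per-token cross entropy of a sentence under the LM (in nats)."""
    tokens = sentence.split()
    return -sum(log_prob(t) for t in tokens) / max(len(tokens), 1)

def sentence_weight(in_lm, out_lm, sentence):
    """Cross-entropy difference H_in(s) - H_out(s), mapped through a sigmoid
    so that more in-domain sentences (negative difference) get weights near 1."""
    diff = cross_entropy(in_lm, sentence) - cross_entropy(out_lm, sentence)
    return 1.0 / (1.0 + math.exp(diff))

# Hypothetical toy corpora standing in for in-domain and out-of-domain data.
in_domain = ["the patient received a dose of the drug",
             "clinical trials show the treatment is effective"]
out_domain = ["the match ended in a draw after extra time",
              "the striker scored twice in the second half"]

in_lm = train_unigram_lm(in_domain)
out_lm = train_unigram_lm(out_domain)

for s in ["the drug dose was effective", "the striker scored a goal"]:
    print(f"{sentence_weight(in_lm, out_lm, s):.3f}  {s}")
```

In a full pipeline, these per-sentence weights would scale the training loss of each out-of-domain sentence pair during NMT training, which is the reweighting role the classifier plays in the experiments described above.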