This paper describes the PROMT submissions for the WMT21 Terminology Translation Task. We participate in two directions: English to French and English to Russian. Our final submissions are MarianNMT-based neural systems. We present two technologies for terminology translation: a modification of the Dinu et al. (2019) soft-constrained approach and our own approach called PROMT Smart Neural Dictionary (SmartND). We achieve good results in both directions.