In this paper we present the FJWU system submitted to the biomedical shared task at WMT21. We prepared state-of-the-art multilingual neural machine translation systems for three source languages (German, Spanish and French) with English as the target language. Our NMT systems, based on the Transformer architecture, were trained on a combination of in-domain and out-of-domain parallel corpora developed using Information Retrieval (IR) and domain adaptation techniques.
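The IR-based step described above is commonly realized by ranking out-of-domain sentences according to their similarity to the in-domain corpus and keeping only the closest ones as pseudo in-domain training data. The paper does not spell out its retrieval setup here, so the following is only a minimal sketch of that general technique, using TF-IDF cosine similarity over tokenized sentences; all function names are illustrative.

```python
# Hedged sketch of IR-based in-domain data selection: score each
# out-of-domain sentence against a query built from the in-domain corpus
# and keep the top-ranked ones. This is a generic illustration, not the
# exact method used in the FJWU system.
import math
from collections import Counter


def tfidf_vectors(docs):
    """Build sparse TF-IDF vectors (dicts of term -> weight) for tokenized docs."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    n = len(docs)
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (c / len(doc)) * math.log((1 + n) / (1 + df[t]))
                     for t, c in tf.items()})
    return vecs


def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def select_pseudo_in_domain(in_domain, out_domain, top_k):
    """Rank out-of-domain sentences by similarity to the concatenated
    in-domain corpus (used as a single IR query) and keep the top_k."""
    query_doc = sum(in_domain, [])           # concatenate in-domain tokens
    vecs = tfidf_vectors([query_doc] + out_domain)
    query, cands = vecs[0], vecs[1:]
    ranked = sorted(range(len(out_domain)),
                    key=lambda i: cosine(query, cands[i]), reverse=True)
    return [out_domain[i] for i in ranked[:top_k]]


# Toy usage: biomedical-looking sentences outrank the unrelated one.
in_domain = [["patient", "received", "dose"], ["clinical", "trial", "dose"]]
out_domain = [["the", "patient", "dose", "was", "adjusted"],
              ["stock", "prices", "fell", "today"],
              ["clinical", "results", "were", "positive"]]
selected = select_pseudo_in_domain(in_domain, out_domain, top_k=2)
```

In practice such selection is run over millions of sentence pairs with a proper IR engine or sentence embeddings, and the selected pseudo in-domain data is mixed with the true in-domain corpus before (or during) fine-tuning.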