We investigate transfer learning based on pre-trained neural machine translation models to translate between (low-resource) similar languages. This work is part of our contribution to the WMT 2021 Similar Languages Translation Shared Task, where we submitted models for different language pairs, including French-Bambara, Spanish-Catalan, and Spanish-Portuguese, in both directions. Our models for Catalan-Spanish (82.79 BLEU) and Portuguese-Spanish (87.11 BLEU) ranked first in the official shared task evaluation, and we are the only team to submit models for the French-Bambara pair.