In this paper, we present the systems submitted by our team from the Institute of ICT (HEIG-VD / HES-SO) to the Unsupervised MT and Very Low Resource Supervised MT task. We first study the improvements brought to a baseline system by techniques such as back-translation and initialization from a parent model. We find that both techniques are beneficial and suffice to reach performance that compares with more sophisticated systems from the 2020 task. We then present the application of this system to the 2021 task for low-resource supervised Upper Sorbian (HSB) to German translation, in both directions. Finally, we present a contrastive system for HSB-DE in both directions, and for unsupervised German to Lower Sorbian (DSB) translation, which uses multi-task training with various training schedules to improve over the baseline.