We present the findings of the WMT2021 Shared Tasks in Unsupervised MT and Very Low Resource Supervised MT. Within the task, the community studied very low resource supervised translation between German and Upper Sorbian, unsupervised translation between German and Lower Sorbian, and low resource translation between Russian and Chuvash, all minority languages with active language communities working on preserving the languages, who are partners in the evaluation. Thanks to this, we were able to obtain most digital data available for these languages and offer them to the task participants. In total, six teams participated in the shared task. The paper discusses the background, presents the tasks and results, and outlines best practices for the future.