In this paper, we present the details of the systems that we have submitted for the WAT 2021 MultiIndicMT: An Indic Language Multilingual Task. We have submitted two separate multilingual NMT models: one for English to 10 Indic languages and another for 10 Indic languages to English. We discuss the implementation details of two separate multilingual NMT approaches, namely one-to-many and many-to-one, which make use of a shared decoder and a shared encoder, respectively. From our experiments, we observe that the multilingual NMT systems outperform the bilingual baseline MT systems for each of the language pairs under consideration.
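In one-to-many systems with a shared decoder, a common technique is to prepend a target-language token to each source sentence so the decoder knows which language to generate. The sketch below illustrates this tagging step; the function name, tag format, and language codes are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch: target-language tagging for one-to-many multilingual NMT.
# The tag format "<2xx>" follows the common convention of prepending a
# target-language token; the exact scheme used by the submitted systems
# is an assumption here.

def tag_for_one_to_many(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so a shared decoder can select
    which Indic language to generate for an English source sentence."""
    return f"<2{tgt_lang}> {src_sentence}"

# The same English sentence, routed to two different target languages:
pairs = [
    ("The weather is nice today.", "hi"),  # Hindi
    ("The weather is nice today.", "ta"),  # Tamil
]
tagged = [tag_for_one_to_many(sent, lang) for sent, lang in pairs]
```

For the many-to-one direction (shared encoder, English as the sole target), such tags are typically unnecessary, since the output language is fixed.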