We analyze if large language models are able to predict patterns of human reading behavior. We compare the performance of language-specific and multilingual pretrained transformer models to predict reading time measures reflecting natural human sentence processing on Dutch, English, German, and Russian texts. This results in accurate models of human reading behavior, which indicates that transformer models implicitly encode relative importance in language in a way that is comparable to human processing mechanisms. We find that BERT and XLM models successfully predict a range of eye tracking features. In a series of experiments, we analyze the cross-domain and cross-language abilities of these models and show how they reflect human sentence processing.
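To illustrate the kind of setup described above, the following is a minimal sketch (not the authors' released code) of fine-tuning a pretrained multilingual transformer to predict per-token eye-tracking measures as a regression task. The model name, the number of predicted features, and the example sentence are illustrative assumptions; the actual feature set and training data come from the eye-tracking corpora used in the paper.

```python
# Minimal sketch: regressing per-token eye-tracking features from a
# pretrained multilingual transformer. Model name and feature count are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # an XLM checkpoint could be swapped in
NUM_FEATURES = 5  # hypothetical number of eye-tracking measures per token


class EyeTrackingRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(MODEL_NAME)
        hidden_size = self.encoder.config.hidden_size
        # Linear head mapping each token representation to a feature vector
        self.head = nn.Linear(hidden_size, NUM_FEATURES)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        return self.head(hidden_states)  # shape: (batch, seq_len, NUM_FEATURES)


tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = EyeTrackingRegressor()

# Single forward pass on an example sentence; real training would regress
# the predictions against recorded eye-tracking measures with an MSE loss.
batch = tokenizer(
    ["The quick brown fox jumps over the lazy dog."], return_tensors="pt"
)
with torch.no_grad():
    predictions = model(batch["input_ids"], batch["attention_mask"])
print(predictions.shape)  # torch.Size([1, seq_len, NUM_FEATURES])
```

In this kind of setup, cross-language evaluation amounts to fine-tuning the regression head on eye-tracking data from one language and testing on another, which is how the cross-lingual abilities mentioned above can be probed.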