Neural topic models can augment or replace bag-of-words inputs with the learned representations of deep pre-trained transformer-based word prediction models. One added benefit when using representations from multilingual models is that they facilitate zero-shot polylingual topic modeling. However, while it has been widely observed that pre-trained embeddings should be fine-tuned to a given task, it is not immediately clear what supervision should look like for an unsupervised task such as topic modeling. Thus, we propose several methods for fine-tuning encoders to improve both monolingual and zero-shot polylingual neural topic modeling. We consider fine-tuning on auxiliary tasks, constructing a new topic classification task, integrating the topic classification objective directly into topic model training, and continued pre-training. We find that fine-tuning encoder representations on topic classification and integrating the topic classification task directly into topic modeling improves topic quality, and that fine-tuning encoder representations on any task is the most important factor for facilitating cross-lingual transfer.
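The abstract does not specify the architecture, but one way to picture "integrating the topic classification objective directly into topic model training" is sketched below: a VAE-style contextualized topic model (in the spirit of ZeroShotTM/CTM-type models) whose multilingual encoder also feeds an auxiliary classification head, so that classification gradients fine-tune the encoder jointly with the topic modeling loss. This is a minimal illustrative sketch, not the authors' implementation; the encoder checkpoint, `num_topics`, `clf_weight`, and the source of topic labels are all assumptions.

```python
# Illustrative sketch only: a joint objective combining a VAE topic model
# with an auxiliary topic classification head on the encoder embedding.
# Not the paper's exact architecture; hyperparameters are placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel

class JointTopicModel(nn.Module):
    def __init__(self, vocab_size, num_topics, num_classes,
                 encoder_name="bert-base-multilingual-cased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Variational inference network: document embedding -> topic posterior.
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        # Decoder: topic proportions -> bag-of-words reconstruction.
        self.decoder = nn.Linear(num_topics, vocab_size, bias=False)
        # Auxiliary head: topic classification provides the supervision
        # that fine-tunes the encoder during topic model training.
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, bow, labels, clf_weight=1.0):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        doc_emb = out.last_hidden_state[:, 0]          # [CLS] embedding
        mu, logvar = self.mu(doc_emb), self.logvar(doc_emb)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        theta = F.softmax(z, dim=-1)                   # topic proportions
        recon = F.log_softmax(self.decoder(theta), dim=-1)
        recon_loss = -(bow * recon).sum(-1).mean()     # BoW reconstruction
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        clf_loss = F.cross_entropy(self.classifier(doc_emb), labels)
        return recon_loss + kl + clf_weight * clf_loss
```

Setting `clf_weight=0.0` would recover a plain contextualized topic model, which makes the contribution of the classification signal easy to ablate.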
A synthetic fibre was obtained from polyolefin wastes in two ways: 1. from polyethylene and polypropylene wastes mixed in various percentages to determine the optimum percentage in the fibre; 2. from granular polyethylene and polypropylene. We used an indirect recycling operation based on extrusion with an extruder. The resulting fibre was tested mechanically, physically, and chemically, and the results demonstrate that the production of these fibres is feasible both in the laboratory and industrially.