In this paper, we present new solutions of the Hom-Yang-Baxter equation arising from Hom-algebras, Hom-coalgebras and Hom-Lie algebras, respectively. We also prove that these solutions are all self-inverse and give some examples. Finally, we introduce the notion of Hom-Yang-Baxter systems and obtain two kinds of Hom-Yang-Baxter systems.
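For orientation, the Hom-Yang-Baxter equation referred to above has the following standard form (in the sense of Yau); this is quoted from the general literature, not from the paper itself:

```latex
% Hom-Yang-Baxter equation for a Hom-module (V, \alpha) and a linear map
% B : V \otimes V \to V \otimes V commuting with \alpha \otimes \alpha:
(\alpha \otimes B) \circ (B \otimes \alpha) \circ (\alpha \otimes B)
  = (B \otimes \alpha) \circ (\alpha \otimes B) \circ (B \otimes \alpha)
```

When $\alpha = \mathrm{id}_V$ this reduces to the usual Yang-Baxter equation for $B$.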
Hybrid automatic speech recognition (ASR) models are typically sequentially trained with CTC or LF-MMI criteria. However, they have vastly different legacies and are usually implemented in different frameworks. In this paper, by decoupling the concepts of modeling units and label topologies and building proper numerator/denominator graphs accordingly, we establish a generalized framework for hybrid acoustic modeling (AM). In this framework, we show that LF-MMI is a powerful training criterion applicable to both limited-context and full-context models, for wordpiece/mono-char/bi-char/chenone units, with both HMM and CTC topologies. From this framework, we propose three novel training schemes: chenone(ch)/wordpiece(wp)-CTC-bMMI and wordpiece(wp)-HMM-bMMI, with different advantages in training performance, decoding efficiency and decoding time-stamp accuracy. The advantages of the different training schemes are evaluated comprehensively on Librispeech, and wp-CTC-bMMI and ch-CTC-bMMI are evaluated on two real-world ASR tasks to show their effectiveness. In addition, we show that bi-char(bc) HMM-MMI models can serve as better alignment models than traditional non-neural GMM-HMMs.
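As background for the bMMI-based schemes above, the boosted MMI objective has the following standard form from the discriminative-training literature (this is the generic formulation, not an equation quoted from the paper):

```latex
% Boosted MMI over utterances u: the numerator graph \mathcal{M}_{W_u}
% encodes the reference transcript, the denominator sums over competing
% hypotheses W, down-weighted by their accuracy A(W, W_u) with boost b.
\mathcal{F}_{\mathrm{bMMI}}(\theta)
  = \sum_{u} \log
    \frac{p_\theta(X_u \mid \mathcal{M}_{W_u})\, P(W_u)}
         {\sum_{W} p_\theta(X_u \mid \mathcal{M}_{W})\, P(W)\, e^{-b\, A(W, W_u)}}
```

Setting $b = 0$ recovers plain MMI; in lattice-free MMI the denominator sum is computed over a phone- or chenone-level graph rather than word lattices.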
In this work, to measure the accuracy and efficiency of a latency-controlled streaming automatic speech recognition (ASR) application, we perform comprehensive evaluations on three popular training criteria: LF-MMI, CTC and RNN-T. Transcribing social media videos in 7 languages with 3K-14K hours of training data, we conduct large-scale controlled experimentation across each criterion using identical datasets and encoder model architecture. We find that RNN-T consistently wins in ASR accuracy, while CTC models excel at inference efficiency. Moreover, we selectively examine various modeling strategies for the different training criteria, including modeling units, encoder architectures, pre-training, etc. For such a large-scale real-world streaming ASR application, to the best of our knowledge, we present the first comprehensive benchmark of these three widely used training criteria across many languages.
In this paper, we construct a new braided monoidal category over two Hom-Hopf algebras $(H,\alpha)$ and $(B,\beta)$ and associate it with two nonlinear equations. We first introduce the notion of an $(H,B)$-Hom-Long dimodule and show that the Hom-Long dimodule category $^{B}_{H}\mathbb{L}$ is an autonomous category. Second, we prove that the category $^{B}_{H}\mathbb{L}$ is a braided monoidal category if $(H,\alpha)$ is quasitriangular and $(B,\beta)$ is coquasitriangular, and obtain a solution of the quantum Yang-Baxter equation. Also, we show that the category $^{B}_{H}\mathbb{L}$ can be viewed as a subcategory of the Hom-Yetter-Drinfeld category $^{H\otimes B}_{H\otimes B}\mathbb{HYD}$. Finally, we obtain a solution of the Hom-Long equation from the Hom-Long dimodules.
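For reference, the quantum Yang-Baxter equation mentioned above is, in standard leg notation (this is the classical form, not copied from the paper):

```latex
% Quantum Yang-Baxter equation for R \in H \otimes H,
% where R_{12} = R \otimes 1, R_{23} = 1 \otimes R, and R_{13}
% acts on the first and third tensor factors:
R_{12}\, R_{13}\, R_{23} = R_{23}\, R_{13}\, R_{12}
```

In the quasitriangular case, the R-matrix of $(H,\alpha)$ supplies such a solution on every module in the braided category.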
We compare a Gromov hyperbolic metric with the hyperbolic metric in the unit ball or in the upper half space, and prove sharp comparison inequalities between the Gromov hyperbolic metric and some hyperbolic-type metrics. We also obtain several sharp distortion inequalities for the Gromov hyperbolic metric under some families of Möbius transformations.
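The two classical hyperbolic metrics being compared have the following standard densities (stated here for context; the notation is the usual one, not the paper's):

```latex
% Hyperbolic metric density in the unit ball \mathbb{B}^n and in the
% upper half space \mathbb{H}^n = \{x \in \mathbb{R}^n : x_n > 0\}:
\lambda_{\mathbb{B}^n}(x)\,|dx| = \frac{2\,|dx|}{1 - |x|^2},
\qquad
\lambda_{\mathbb{H}^n}(x)\,|dx| = \frac{|dx|}{x_n}
```

Both are Möbius invariant, which is what makes sharp distortion inequalities under Möbius transformations a natural question.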
In this work, we first show that on the widely used LibriSpeech benchmark, our transformer-based context-dependent connectionist temporal classification (CTC) system produces state-of-the-art results. We then show that using wordpieces as modeling units combined with CTC training, we can greatly simplify the engineering pipeline compared to conventional frame-based cross-entropy training by excluding all the GMM bootstrapping, decision tree building and force alignment steps, while still achieving a very competitive word error rate. Additionally, using wordpieces as modeling units can significantly improve runtime efficiency, since we can use a larger stride without losing accuracy. We further confirm these findings on two internal VideoASR datasets: German, which is similar to English as a fusional language, and Turkish, which is an agglutinative language.
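Since several of these abstracts rely on CTC with wordpiece units, a minimal sketch of the standard CTC collapsing rule (collapse consecutive repeats, then drop blanks) may be useful. The label ids and blank index below are illustrative only, not taken from any of the papers:

```python
def ctc_greedy_decode(frame_ids, blank=0):
    """Greedy CTC decoding: collapse consecutive repeats, then drop blanks.

    A blank between two identical labels keeps them distinct, which is how
    a CTC model emits genuinely repeated characters or wordpieces.
    """
    decoded = []
    prev = None
    for label in frame_ids:
        if label != prev:          # a new symbol starts at this frame
            if label != blank:     # blanks are never emitted
                decoded.append(label)
            prev = label
    return decoded


# Illustrative frame-level argmax sequence (blank = 0): the blank at
# position 3 separates the two occurrences of label 3.
print(ctc_greedy_decode([0, 3, 3, 0, 3, 5, 5, 0]))  # [3, 3, 5]
```

Larger strides shrink the number of frames this loop sees per utterance, which is the runtime benefit the abstract attributes to wordpiece units.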
In this paper, we introduce the notion of the Hom-Leibniz-Rinehart algebra as an algebraic analogue of a Hom-Leibniz algebroid, and prove that an arbitrary split regular Hom-Leibniz-Rinehart algebra $L$ is of the form $L=U+\sum_\gamma I_\gamma$, with $U$ a subspace of a maximal abelian subalgebra $H$ and each $I_\gamma$ a well-described ideal of $L$ satisfying $[I_\gamma, I_\delta]=0$ if $[\gamma]\neq[\delta]$. We then develop techniques of connections of roots and weights for split Hom-Leibniz-Rinehart algebras, respectively. Finally, we study the structures of tight split regular Hom-Leibniz-Rinehart algebras.
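The underlying bracket identity can be recalled as follows; this is the standard (left) Hom-Leibniz identity from the literature, while the Rinehart structure additionally carries an $A$-module action not written out here:

```latex
% Hom-Leibniz identity for (L, [\cdot,\cdot], \alpha):
[\alpha(x), [y, z]] = [[x, y], \alpha(z)] + [\alpha(y), [x, z]]
```

When the bracket is skew-symmetric this reduces to the Hom-Jacobi identity, i.e. a Hom-Lie algebra.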
In this paper, we introduce the definitions of generalized BiHom-Lie algebras and generalized BiHom-Lie admissible algebras in the category ${}_H\mathcal{M}$ of left modules over any quasitriangular Hopf algebra $(H,R)$. We also describe the BiHom-Lie ideal structures of BiHom-associative algebras.
Let $(L, \alpha)$ be a Hom-Lie-Yamaguti superalgebra. We first introduce the representation and cohomology theory of Hom-Lie-Yamaguti superalgebras. Furthermore, we introduce the notions of generalized derivations and representations of $(L, \alpha)$ and present some of their properties. Finally, we investigate the deformations of $(L, \alpha)$ by choosing suitable cohomology.
We introduce the notion of 3-Hom-Lie-Rinehart algebra and systematically describe a cohomology complex by considering coefficient modules. Furthermore, we consider extensions of a 3-Hom-Lie-Rinehart algebra and characterize the first cohomology space in terms of the group of automorphisms of an $A$-split abelian extension and the equivalence classes of $A$-split abelian extensions. Finally, we study formal deformations of 3-Hom-Lie-Rinehart algebras.