
Further Results for Perron-Frobenius Theorem for Nonnegative Tensors II

 Added by Yuning Yang
 Publication date 2011
Language: English





In this paper, we generalize some conclusions from nonnegative irreducible tensors to nonnegative weakly irreducible tensors and establish further properties of the associated eigenvalue problems.




For any positive integers $r$, $s$, $m$, $n$, an $(r,s)$-order $(n,m)$-dimensional rectangular tensor $\mathcal{A}=(a_{i_1\cdots i_r}^{j_1\cdots j_s}) \in (\mathbb{R}^n)^r\times (\mathbb{R}^m)^s$ is called partially symmetric if it is invariant under any permutation of the lower $r$ indices and any permutation of the upper $s$ indices. Such partially symmetric rectangular tensors arise naturally in the study of directed hypergraphs. Ling and Qi [Front. Math. China, 2013] first studied the $(p,q)$-spectral radius (or singular values) and proved a Perron-Frobenius theorem for such tensors when both $p,q \geq r+s$. We improve their results by extending them to all $(p,q)$ satisfying $\frac{r}{p}+\frac{s}{q}\leq 1$. We also prove the Perron-Frobenius theorem for general nonnegative $(r,s)$-order $(n,m)$-dimensional rectangular tensors when $\frac{r}{p}+\frac{s}{q}>1$, and show that this is essentially best possible without additional conditions on $\mathcal{A}$. Finally, we apply these results to study the $(p,q)$-spectral radius of $(r,s)$-uniform directed hypergraphs.
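For reference, a common variational form of the $(p,q)$-spectral radius in this setting (the notation below is an assumption based on the surrounding definitions, not quoted from the paper) is:

$$\lambda_{p,q}(\mathcal{A}) \;=\; \max\Big\{\, \mathcal{A}x^{r}y^{s} \;:\; \|x\|_p = 1,\ \|y\|_q = 1,\ x\in\mathbb{R}^n_{\geq 0},\ y\in\mathbb{R}^m_{\geq 0} \,\Big\},
\qquad
\mathcal{A}x^{r}y^{s} \;=\; \sum_{i_1,\dots,i_r}\sum_{j_1,\dots,j_s} a_{i_1\cdots i_r}^{j_1\cdots j_s}\, x_{i_1}\cdots x_{i_r}\, y_{j_1}\cdots y_{j_s}.$$

The condition $\frac{r}{p}+\frac{s}{q}\leq 1$ is exactly the regime in which the objective $\mathcal{A}x^{r}y^{s}$ is dominated by the norm constraints, which is why the Perron-Frobenius conclusions extend there.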
In further pursuit of the diagonalizable \emph{real nonnegative inverse eigenvalue problem} (RNIEP), we study the relationship between the \emph{row cone} $\mathcal{C}_r(S)$ and the \emph{spectracone} $\mathcal{C}(S)$ of a Perron similarity $S$. In the process, a new kind of matrix, \emph{row Hadamard conic} (RHC), is defined and related to the D-RNIEP. Characterizations are given when $\mathcal{C}_r(S) = \mathcal{C}(S)$, and explicit examples are given for all possible set-theoretic relationships between the two cones. The symmetric NIEP is the special case of the D-RNIEP in which the Perron similarity $S$ is also orthogonal.
Yuning Yang, Qingzhi Yang (2011)
In this paper, we mainly focus on how to generalize some conclusions from nonnegative irreducible tensors to nonnegative weakly irreducible tensors. To this end, a basic and important lemma is proved using new tools. First, we give the definition of stochastic tensors. Then we show that every nonnegative weakly irreducible tensor whose spectral radius is one is diagonally similar to a unique weakly irreducible stochastic tensor. Based on this, we prove several important lemmas that help us generalize the related results. Counterexamples are provided to show that some conclusions for nonnegative irreducible tensors do not hold for nonnegative weakly irreducible tensors.
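The spectral radius of a nonnegative weakly irreducible tensor can be approximated numerically by a power-type iteration in the spirit of the Ng-Qi-Zhou algorithm, using the min/max ratios of $(\mathcal{A}x^{m-1})_i / x_i^{m-1}$ as lower and upper bounds. A minimal sketch for an order-3 tensor follows; the function name, tolerance, and stopping rule are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def tensor_spectral_radius(A, tol=1e-10, max_iter=1000):
    """Power-type iteration (NQZ-style sketch) for the spectral radius
    of a nonnegative order-3 tensor A of shape (n, n, n).

    Assumes A is weakly irreducible so that the iterates stay positive
    and the min/max ratio bounds converge to a common value.
    """
    n = A.shape[0]
    x = np.ones(n)                      # positive starting vector
    lo = hi = 0.0
    for _ in range(max_iter):
        # y_i = (A x^{m-1})_i = sum_{j,k} A[i,j,k] x_j x_k   (m = 3)
        y = np.einsum('ijk,j,k->i', A, x, x)
        ratios = y / x**2               # bounds: min <= rho(A) <= max
        lo, hi = ratios.min(), ratios.max()
        if hi - lo < tol:
            break
        x = y**0.5                      # x_new = y^{1/(m-1)}
        x /= x.sum()                    # normalize to keep iterates bounded
    return 0.5 * (lo + hi)
```

For the all-ones tensor of dimension $n$ and order $m$, the eigenvector is uniform and the spectral radius is $n^{m-1}$, which gives a quick sanity check of the sketch.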
J.M. Chen, Z.B. Gao, E. Wicks (2019)
The Frobenius-Perron theory of an endofunctor of a $\Bbbk$-linear category (recently introduced in \cite{CG}) provides new invariants for abelian and triangulated categories. Here we study Frobenius-Perron type invariants for derived categories of commutative and noncommutative projective schemes. In particular, we calculate the Frobenius-Perron dimension for domestic and tubular weighted projective lines, define Frobenius-Perron generalizations of Calabi-Yau and Kodaira dimensions, and provide examples. We apply this theory to the derived categories associated to certain Artin-Schelter regular and finite-dimensional algebras.
Existing tensor factorization methods assume that the input tensor follows some specific distribution (e.g. Poisson, Bernoulli, or Gaussian) and solve the factorization by minimizing an empirical loss function defined according to that distribution. However, this approach suffers from several drawbacks: 1) in reality, the underlying distributions are complicated and unknown, making them infeasible to approximate with a simple distribution; 2) the correlation across dimensions of the input tensor is not well utilized, leading to sub-optimal performance. Although heuristics have been proposed to incorporate such correlation as side information under a Gaussian distribution, they cannot easily be generalized to other distributions. Thus, a more principled way of utilizing correlation in tensor factorization models remains an open challenge. Without assuming any explicit distribution, we formulate tensor factorization as an optimal transport problem with the Wasserstein distance, which can handle nonnegative inputs. We introduce SWIFT, which minimizes the Wasserstein distance between the input tensor and its reconstruction. In particular, we define the $N$-th order tensor Wasserstein loss for the widely used tensor CP factorization and derive an optimization algorithm that minimizes it. By leveraging sparsity structure and different equivalent formulations for computational efficiency, SWIFT is as scalable as other well-known CP algorithms. Using the factor matrices as features, SWIFT achieves up to 9.65% and 11.31% relative improvement over baselines on downstream prediction tasks. Under noisy conditions, SWIFT achieves up to 15% and 17% relative improvement over the best competitors on the prediction tasks.
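The CP factorization that SWIFT builds on writes an order-3 tensor as a sum of rank-one terms, $X_{ijk} \approx \sum_r A_{ir} B_{jr} C_{kr}$. The sketch below shows the classical alternating-least-squares (ALS) solver for this model — not SWIFT's Wasserstein objective, whose Sinkhorn-based optimization is beyond a short example; all function names here are illustrative assumptions:

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker product: row (j*K + k) holds B[j,:] * C[k,:]."""
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def cp_als(X, rank, n_iter=300, seed=0):
    """Minimal CP-ALS sketch for an order-3 tensor X (plain least-squares
    loss, not SWIFT's Wasserstein loss). Returns factor matrices A, B, C."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    # Mode unfoldings consistent with the Khatri-Rao row ordering above:
    X0 = X.reshape(I, -1)                    # X0[i, j*K + k] = X[i, j, k]
    X1 = np.moveaxis(X, 1, 0).reshape(J, -1)  # X1[j, i*K + k]
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1)  # X2[k, i*J + j]
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem in one factor,
        # using (M^T M) * (N^T N) = (khatri_rao(M, N))^T khatri_rao(M, N).
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

SWIFT replaces the Frobenius-norm fit implicit in these least-squares updates with the $N$-th order tensor Wasserstein loss, while keeping the same CP factor structure.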
