
Tensors in computations

Published by: Lek-Heng Lim
Publication date: 2021
Research field: Informatics engineering
Paper language: English
Author: Lek-Heng Lim





The notion of a tensor captures three great ideas: equivariance, multilinearity, separability. But trying to be three things at once makes the notion difficult to understand. We will explain tensors in an accessible and elementary way through the lens of linear algebra and numerical linear algebra, elucidated with examples from computational and applied mathematics.
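Two of the three ideas are easy to make concrete in code. Below is a minimal NumPy sketch (not from the paper; all names are illustrative) that checks multilinearity of a third-order tensor acting as a trilinear functional, and builds a separable, i.e. rank-one, tensor as an outer product.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 3))            # an order-3 tensor
u, v, w, u2 = rng.standard_normal((4, 3))     # four test vectors

def trilinear(T, u, v, w):
    # sum_{i,j,k} T[i,j,k] * u[i] * v[j] * w[k]
    return np.einsum("ijk,i,j,k->", T, u, v, w)

# Multilinearity: T(a*u + b*u2, v, w) = a*T(u, v, w) + b*T(u2, v, w),
# and likewise in each of the other two arguments.
a, b = 2.0, -3.0
assert np.isclose(
    trilinear(T, a * u + b * u2, v, w),
    a * trilinear(T, u, v, w) + b * trilinear(T, u2, v, w),
)

# Separability: a rank-one tensor is an outer product of three vectors,
# so the trilinear functional factors into three inner products.
rank_one = np.einsum("i,j,k->ijk", u, v, w)
assert np.isclose(trilinear(rank_one, u, v, w),
                  (u @ u) * (v @ v) * (w @ w))
```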




Read also

Long Chen, Xuehai Huang (2021)
Several div-conforming and divdiv-conforming finite elements for symmetric tensors on simplexes in arbitrary dimension are constructed in this work. The shape function space is first split into the trace space and the bubble space. The latter is further decomposed into the null space of the differential operator and its orthogonal complement. Instead of characterizing these subspaces of the shape function space, characterizations of the dual spaces are provided. Vector div-conforming finite elements are first constructed as an introductory example. Then new symmetric div-conforming finite elements are constructed. The dual subspaces are then used as building blocks to construct divdiv-conforming finite elements.
A third order real tensor is mapped to a special f-diagonal tensor by going through the Discrete Fourier Transform (DFT), the standard matrix SVD, and the inverse DFT. We call such an f-diagonal tensor an s-diagonal tensor. An f-diagonal tensor is an s-diagonal tensor if and only if it is mapped to itself in the above process. The third order tensor space is partitioned into orthogonal equivalence classes. Each orthogonal equivalence class has a unique s-diagonal tensor. Two s-diagonal tensors are equal if they are orthogonally equivalent. Third order tensors in an orthogonal equivalence class have the same tensor tubal rank and T-singular values. Four meaningful necessary conditions for s-diagonal tensors are presented. Then we present a set of necessary and sufficient conditions for s-diagonal tensors. Such conditions involve a special complex number. In the cases where the dimension of the third mode of the considered tensor is $2$, $3$ or $4$, we present direct necessary and sufficient conditions which do not involve such a complex number.
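A minimal NumPy sketch of the pipeline described above, under the usual t-SVD conventions (DFT along the third mode, matrix SVD per frontal slice keeping only singular values, inverse DFT); the function name is illustrative. The output is f-diagonal, and since the map is idempotent it is a fixed point, matching the abstract's characterization of s-diagonal tensors.

```python
import numpy as np

def f_diagonalize(A):
    """Map a real 3rd-order tensor to an f-diagonal tensor via
    DFT along the third mode -> matrix SVD of each frontal slice
    (keeping only singular values) -> inverse DFT."""
    n1, n2, n3 = A.shape
    Ahat = np.fft.fft(A, axis=2)                  # DFT along mode 3
    Shat = np.zeros((n1, n2, n3), dtype=complex)
    for k in range(n3):                           # standard matrix SVD per slice
        s = np.linalg.svd(Ahat[:, :, k], compute_uv=False)
        np.fill_diagonal(Shat[:, :, k], s)
    return np.fft.ifft(Shat, axis=2).real         # imaginary part is roundoff

A = np.random.default_rng(1).standard_normal((4, 4, 3))
S = f_diagonalize(A)

for k in range(S.shape[2]):                       # f-diagonal: slices are diagonal
    assert np.allclose(S[:, :, k], np.diag(np.diag(S[:, :, k])))
assert np.allclose(f_diagonalize(S), S)           # fixed point of the process
```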
Fully adaptive computations of the resistive magnetohydrodynamic (MHD) equations are presented in two and three space dimensions using a finite volume discretization on locally refined dyadic grids. Divergence cleaning is used to control the incompressibility constraint of the magnetic field. For automatic grid adaptation, a cell-averaged multiresolution analysis is applied which guarantees the precision of the adaptive computations while reducing CPU time and memory requirements. Implementation issues of the open source code CARMEN-MHD are discussed. To illustrate its precision and efficiency, different benchmark computations including shock-cloud interaction and magnetic reconnection are presented.
We consider the problem of recovering an orthogonally decomposable tensor with a subset of elements distorted by noise of arbitrarily large magnitude. We focus on the particular case where each mode in the decomposition is corrupted by noise vectors with components that are correlated locally, i.e., with nearby components. We show that this deterministic tensor completion problem has the unusual property that it can be solved in polynomial time if the rank of the tensor is sufficiently large. This is the polar opposite of the low-rank assumptions of typical low-rank tensor and matrix completion settings. We show that our problem can be solved through a system of coupled Sylvester-like equations and show how to accelerate their solution by an alternating solver. This enables recovery even with a substantial number of missing entries, for instance for $n$-dimensional tensors of rank $n$ with up to $40\%$ missing entries.
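The coupled system itself is specific to the paper, but its basic building block is standard. A minimal SciPy sketch (random placeholder matrices, not the paper's construction) solving a single Sylvester equation AX + XB = Q:

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Random placeholders; the equation is solvable as long as A and -B
# share no eigenvalues, which holds almost surely for random matrices.
rng = np.random.default_rng(2)
A, B, Q = rng.standard_normal((3, 6, 6))

X = solve_sylvester(A, B, Q)                      # solves A X + X B = Q
assert np.allclose(A @ X + X @ B, Q)
```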
We study the problem of finding orthogonal low-rank approximations of symmetric tensors. In the case of matrices, the approximation is a truncated singular value decomposition which is then symmetric. Moreover, for rank-one approximations of tensors of any dimension, a classical result proven by Banach in 1938 shows that the optimal approximation can always be chosen to be symmetric. In contrast to these results, this article shows that the corresponding statement is no longer true for orthogonal approximations of higher rank. Specifically, for any of the four common notions of tensor orthogonality used in the literature, we show that optimal orthogonal approximations of rank greater than one cannot always be chosen to be symmetric.
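For the rank-one symmetric case covered by Banach's theorem, a standard approach (not this article's contribution) is the symmetric higher-order power method. A minimal NumPy sketch follows; note that the plain iteration is not guaranteed to converge in general, and shifted variants exist to address this.

```python
import numpy as np
from itertools import permutations

def symmetrize(A):
    """Average an order-3 tensor over all 6 permutations of its axes."""
    return sum(A.transpose(p) for p in permutations(range(3))) / 6.0

rng = np.random.default_rng(3)
T = symmetrize(rng.standard_normal((4, 4, 4)))    # a symmetric test tensor

x = rng.standard_normal(4)
x /= np.linalg.norm(x)
for _ in range(200):                              # power-type iteration
    y = np.einsum("ijk,i,j->k", T, x, x)          # y = T(x, x, .)
    x = y / np.linalg.norm(y)

lam = np.einsum("ijk,i,j,k->", T, x, x, x)        # lam = T(x, x, x)
R = lam * np.einsum("i,j,k->ijk", x, x, x)        # symmetric rank-one term
print("relative residual:", np.linalg.norm(T - R) / np.linalg.norm(T))
```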