Perturbation or error bounds of functions have been of great interest for a long time. If the functions are differentiable, then the mean value theorem and Taylor's theorem come in handy for this purpose. While the former is useful in estimating $|f(A+X)-f(A)|$ in terms of $|X|$ and requires the norm of the first derivative of the function, the latter is useful in computing higher order perturbation bounds and needs norms of the higher order derivatives of the function. In the study of matrices, the determinant is an important function. Other scalar valued functions, such as eigenvalues and the coefficients of the characteristic polynomial, are also well studied. Another interesting function in this category is the permanent, an analogue of the determinant in matrix theory. More generally, there are operator valued functions such as tensor powers, antisymmetric tensor powers and symmetric tensor powers, which have gained importance in the past. In this article, we survey recent work on the higher order derivatives of these functions and their norms. Using Taylor's theorem, higher order perturbation bounds are obtained. Some of these results are very recent, and their detailed proofs will appear elsewhere.
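For orientation (a generic sketch of the type of bound referred to above, not the precise statements surveyed in the article), if $f$ is $(n+1)$-times Fréchet differentiable on the segment joining $A$ and $A+X$, the two theorems give estimates of the form
$$\|f(A+X)-f(A)\| \le \sup_{0\le t\le 1}\|Df(A+tX)\|\,\|X\|$$
and
$$\Bigl\|f(A+X)-f(A)-\sum_{m=1}^{n}\tfrac{1}{m!}\,D^{m}f(A)(X,\ldots,X)\Bigr\| \le \frac{\|X\|^{\,n+1}}{(n+1)!}\,\sup_{0\le t\le 1}\|D^{n+1}f(A+tX)\|,$$
where $D^{m}f$ denotes the $m$-th derivative and the norms on the right-hand sides are the operator norms of the corresponding multilinear maps.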
In this paper we show how to approximate (learn) a function $f \colon X \to Y$, where $X$ and $Y$ are metric spaces.
A $k$-submodular function is a function that, given $k$ disjoint subsets, outputs a value that is submodular in every orthant. In this paper, we provide a new framework for $k$-submodular maximization problems, by relaxing the optimization to the continuous space with the multilinear extension of $k$-submodular functions and a variant of pipage rounding that recovers the discrete solution. The multilinear extension introduces new techniques to analyze and optimize $k$-submodular functions. When the function is monotone, we propose almost $\frac{1}{2}$-approximation algorithms for unconstrained maximization and maximization under total size and knapsack constraints. For unconstrained monotone and non-monotone maximization, we propose an algorithm that is almost as good as any combinatorial algorithm based on Iwata, Tanigawa, and Yoshida's meta-framework ($\frac{k}{2k-1}$-approximation for the monotone case and $\frac{k^2+1}{2k^2+1}$-approximation for the non-monotone case).
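To make the continuous relaxation concrete (the notation here is a standard way of writing the multilinear extension and is not taken verbatim from the paper), for a $k$-submodular function $f$ on a ground set $V$, a point $\mathbf{x}\in[0,1]^{V\times k}$ with $\sum_{i=1}^{k}x_{v,i}\le 1$ for every $v\in V$ is interpreted as a random assignment in which each element $v$ is independently placed in the $i$-th set with probability $x_{v,i}$ and left unassigned otherwise, and the extension is the resulting expectation
$$F(\mathbf{x})=\sum_{(S_1,\ldots,S_k)}f(S_1,\ldots,S_k)\prod_{i=1}^{k}\prod_{v\in S_i}x_{v,i}\prod_{v\notin S_1\cup\cdots\cup S_k}\Bigl(1-\sum_{i=1}^{k}x_{v,i}\Bigr),$$
where the sum runs over all $k$-tuples of pairwise disjoint subsets of $V$.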
In this paper we introduce an abstract approach to the notion of absolutely summing multilinear operators. We show that several previous results in different contexts (absolutely summing, almost summing, Cohen summing) are particular cases of our general results.
The LULU operators, well known in the nonlinear multiresolution analysis of sequences, are extended to functions defined on a continuous domain, namely, a real interval. We show that the extended operators replicate the essential properties of their discrete counterparts. More precisely, they form a fully ordered semigroup of four elements and preserve both the local trend and the total variation.
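For context (these are the commonly used discrete definitions, recalled here only as background and not quoted from the paper), the basic LULU operators of width $n$ act on a sequence $x=(x_i)$ by
$$(L_n x)_i=\max_{i-n\le j\le i}\ \min_{j\le m\le j+n}x_m, \qquad (U_n x)_i=\min_{i-n\le j\le i}\ \max_{j\le m\le j+n}x_m,$$
so that $L_n$ removes upward pulses of width at most $n$, $U_n$ removes downward pulses, and $L_n x\le x\le U_n x$ pointwise; the extension discussed above replaces sequences by functions on a real interval.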
Using elementary techniques, we prove sharp anisotropic Hardy-Littlewood inequalities for positive multilinear forms. In particular, we recover an inequality proved by F. Bayart in 2018.