
Fast minimization of structured convex quartics

Posted by: Brian Bullins
Publication date: 2018
Research language: English
Author: Brian Bullins





We propose faster methods for unconstrained optimization of \emph{structured convex quartics}, which are convex functions of the form
\begin{equation*}
f(x) = c^\top x + x^\top \mathbf{G} x + \mathbf{T}[x,x,x] + \frac{1}{24} \left\| \mathbf{A} x \right\|_4^4
\end{equation*}
for $c \in \mathbb{R}^d$, $\mathbf{G} \in \mathbb{R}^{d \times d}$, $\mathbf{T} \in \mathbb{R}^{d \times d \times d}$, and $\mathbf{A} \in \mathbb{R}^{n \times d}$ such that $\mathbf{A}^\top \mathbf{A} \succ 0$. In particular, we show how to achieve an $\epsilon$-optimal minimizer for such functions with only $O(n^{1/5}\log^{O(1)}(\mathcal{Z}/\epsilon))$ calls to a gradient oracle and linear system solver, where $\mathcal{Z}$ is a problem-dependent parameter. Our work extends recent ideas on efficient tensor methods and higher-order acceleration techniques to develop a descent method for optimizing the relevant quartic functions. As a natural consequence of our method, we achieve an overall cost of $O(n^{1/5}\log^{O(1)}(\mathcal{Z} / \epsilon))$ calls to a gradient oracle and (sparse) linear system solver for the problem of $\ell_4$-regression when $\mathbf{A}^\top \mathbf{A} \succ 0$, providing additional insight into what may be achieved for general $\ell_p$-regression. Our results show the benefit of combining efficient higher-order methods with recent acceleration techniques for improving convergence rates in fundamental convex optimization problems.
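Since the objective class above is fully specified by the data $(c, \mathbf{G}, \mathbf{T}, \mathbf{A})$, a reference implementation of $f$ and its gradient is straightforward. The sketch below (Python/NumPy, on a randomly generated instance with a zero cubic term so convexity is immediate) only evaluates the quartic and hands it to a generic quasi-Newton routine; it illustrates the problem form, not the paper's accelerated $O(n^{1/5}\log^{O(1)}(\mathcal{Z}/\epsilon))$-oracle-call method, and all problem data are invented for the example.

```python
# Illustrative sketch only (assumed toy instance): evaluates the structured convex
# quartic from the abstract and minimizes it with SciPy's generic L-BFGS routine,
# NOT the paper's method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, n = 5, 20
c = rng.standard_normal(d)
G = rng.standard_normal((d, d)); G = G @ G.T       # symmetric PSD quadratic term
T = np.zeros((d, d, d))                            # zero cubic term keeps the toy instance convex
A = rng.standard_normal((n, d))                    # with n >= d, A^T A > 0 almost surely

def f(x):
    Ax = A @ x
    return (c @ x + x @ G @ x
            + np.einsum('ijk,i,j,k->', T, x, x, x)
            + np.sum(Ax ** 4) / 24.0)

def grad_f(x):
    Ax = A @ x
    # term by term: c, (G + G^T) x, 3 T[x, x, .] (valid for symmetric T), (1/6) A^T (Ax)^3
    return (c + (G + G.T) @ x
            + 3.0 * np.einsum('ijk,i,j->k', T, x, x)
            + A.T @ (Ax ** 3) / 6.0)

res = minimize(f, np.zeros(d), jac=grad_f, method="L-BFGS-B")
print(res.fun, np.linalg.norm(grad_f(res.x)))      # optimal value and gradient norm
```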




Read also

Christian Leonard (2007)
One revisits the standard saddle-point method based on conjugate duality for solving convex minimization problems. Our aim is to reduce or remove unnecessary topological restrictions on the constraint set. Dual equalities and characterizations of the minimizers are obtained with weak or without constraint qualifications. The main idea is to work with intrinsic topologies which reflect some geometry of the objective function. The abstract results of this article are applied in other papers to the Monge-Kantorovich optimal transport problem and the minimization of entropy functionals.
C. Vallee, C. Zalinescu (2015)
A formula for the sub-differential of the sum of a series of convex functions defined on a Banach space was provided by X. Y. Zheng in 1998. In this paper, besides a slight extension of Zheng's results to locally convex spaces, we provide a formula for the conjugate of a countable sum of convex functions. Then we use these results for calculating the sub-differentials and the conjugates in two situations related to entropy minimization, and we study a concrete example met in Statistical Physics.
We propose a new majorization-minimization (MM) method for non-smooth and non-convex programs, which is general enough to include the existing MM methods. Besides the local majorization condition, we only require that the difference between the directional derivatives of the objective function and its surrogate function vanishes when the number of iterations approaches infinity, which is a very weak condition. So our method can use a surrogate function that directly approximates the non-smooth objective function. In comparison, all the existing MM methods construct the surrogate function by approximating the smooth component of the objective function. We apply our relaxed MM methods to the robust matrix factorization (RMF) problem with different regularizations, where our locally majorant algorithm shows advantages over the state-of-the-art approaches for RMF. This is the first algorithm for RMF ensuring, without extra assumptions, that any limit point of the iterates is a stationary point.
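As a concrete picture of the majorization-minimization template this abstract builds on, the minimal sketch below runs a generic MM loop on an assumed scalar example (it is not the paper's relaxed MM method for robust matrix factorization): each iteration minimizes, in closed form, a smooth quadratic surrogate that majorizes the non-smooth objective and touches it at the current iterate.

```python
import numpy as np

# Illustrative MM loop on an assumed toy problem.
# Objective: f(x) = 0.5*(x - b)^2 + lam*|x|   (non-smooth because of |x|).
# Surrogate at x_k != 0: g(x | x_k) = 0.5*(x - b)^2 + lam*(x^2/(2*|x_k|) + |x_k|/2),
# which majorizes f (by AM-GM, x^2/(2a) + a/2 >= |x|) and equals f at x = x_k.
b, lam = 3.0, 1.0
x = 10.0                                  # any nonzero starting point
for k in range(50):
    # Each iteration minimizes the quadratic surrogate in closed form.
    x = b / (1.0 + lam / abs(x))

# The MM iterate approaches the soft-thresholding solution of the original problem.
print(x, np.sign(b) * max(abs(b) - lam, 0.0))
```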
We introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by the alternating direction method of multipliers (ADMM) as a set of fixed-point equations, and then use a nonsmooth Newton method to find a solution; we apply the basic idea to the Splitting Cone Solver (SCS), a state-of-the-art method for solving generic conic optimization problems. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility, when appropriate, and can be specialized to solve specific convex problems.
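To make the fixed-point view concrete, the toy sketch below (an assumed example, not the SCS-based Newton-ADMM of the paper) writes one ADMM sweep for min 0.5*||x - a||^2 subject to x >= 0 as a map T on the pair (z, u), and drives the residual F(w) = w - T(w) to zero with Newton steps using a finite-difference Jacobian as a stand-in for the semismooth Newton machinery.

```python
import numpy as np

# Toy illustration of the fixed-point view behind Newton-ADMM (assumed example).
# Problem: min 0.5*||x - a||^2  s.t.  x >= 0, via ADMM on the splitting x = z.
rng = np.random.default_rng(1)
n = 5
a = rng.standard_normal(n)
rho = 1.0

def admm_sweep(w):
    """One ADMM iteration, written as a map w = (z, u) -> (z+, u+)."""
    z, u = w[:n], w[n:]
    x = (a + rho * (z - u)) / (1.0 + rho)   # x-update (quadratic prox)
    z_new = np.maximum(x + u, 0.0)          # z-update (projection onto x >= 0)
    u_new = u + x - z_new                   # scaled dual update
    return np.concatenate([z_new, u_new])

def residual(w):
    # Fixed points of the ADMM map are exactly the zeros of this residual.
    return w - admm_sweep(w)

w = np.zeros(2 * n)
for _ in range(20):
    F = residual(w)
    if np.linalg.norm(F) < 1e-10:
        break
    eps = 1e-7
    # Finite-difference Jacobian of the residual, then a Newton step.
    J = np.column_stack([(residual(w + eps * e) - F) / eps for e in np.eye(2 * n)])
    w = w - np.linalg.lstsq(J, F, rcond=None)[0]

print(w[:n])                  # Newton-on-the-residual solution
print(np.maximum(a, 0.0))     # closed-form projection, for comparison
```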
The stochastic Frank-Wolfe method has recently attracted much general interest in the context of optimization for statistical and machine learning due to its ability to work with a more general feasible region. However, there has been a complexity gap in the guaranteed convergence rate for stochastic Frank-Wolfe compared to its deterministic counterpart. In this work, we present a new generalized stochastic Frank-Wolfe method which closes this gap for the class of structured optimization problems encountered in statistical and machine learning characterized by empirical loss minimization with a certain type of "linear prediction property" (formally defined in the paper), which is typically present in loss minimization problems in practice. Our method also introduces the notion of a "substitute gradient" that is a not-necessarily-unbiased sample of the gradient. We show that our new method is equivalent to a particular randomized coordinate mirror descent algorithm applied to the dual problem, which in turn provides a new interpretation of randomized dual coordinate descent in the primal space. Also, in the special case of a strongly convex regularizer our generalized stochastic Frank-Wolfe method (as well as the randomized dual coordinate descent method) exhibits linear convergence. Furthermore, we present computational experiments that indicate that our method outperforms other stochastic Frank-Wolfe methods consistent with the theory developed herein.
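For orientation, the minimal sketch below runs a textbook stochastic Frank-Wolfe loop on an assumed least-squares problem over the probability simplex, with a single-sample gradient estimate per iteration; the problem, sampling scheme, and step size are illustrative assumptions and do not implement the generalized method or the substitute-gradient construction of the abstract. Note that plain single-sample stochastic Frank-Wolfe generally needs some form of variance control (e.g., growing batch sizes or gradient averaging) to match deterministic rates, which is part of the gap the paper addresses.

```python
import numpy as np

# Minimal stochastic Frank-Wolfe sketch on an assumed toy problem:
#   min_{w in probability simplex}  (1/2m) * ||X w - y||^2,
# using one sampled data point per iteration for the gradient estimate.
rng = np.random.default_rng(0)
m, d = 200, 10
X = rng.standard_normal((m, d))
w_true = np.zeros(d); w_true[:3] = [0.5, 0.3, 0.2]      # a point in the simplex
y = X @ w_true

w = np.ones(d) / d                                      # start at the simplex center
for k in range(2000):
    i = rng.integers(m)                                 # sample one data point
    g = (X[i] @ w - y[i]) * X[i]                        # unbiased single-sample gradient estimate
    s = np.zeros(d); s[np.argmin(g)] = 1.0              # linear minimization oracle: a simplex vertex
    gamma = 2.0 / (k + 2.0)                             # standard Frank-Wolfe step size
    w = (1.0 - gamma) * w + gamma * s                   # convex combination stays feasible

print(0.5 * np.mean((X @ w - y) ** 2))                  # objective value after the run
```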