For the minimization of a nonlinear cost functional $j$ under convex constraints, the relaxed projected gradient process $\varphi_{k+1} = \varphi_{k} + \alpha_k\bigl(P_H(\varphi_{k}-\lambda_k \nabla_H j(\varphi_{k}))-\varphi_{k}\bigr)$ is a well-known method. The analysis is classically performed in a Hilbert space $H$. We generalize this method to functionals $j$ which are differentiable in a Banach space. Thus it is possible to perform, e.g., an $L^2$ gradient method if $j$ is only differentiable in $L^\infty$. We show global convergence using Armijo backtracking in $\alpha_k$ and allow the inner product and the scaling $\lambda_k$ to change in every iteration. As an application we present a structural topology optimization problem based on a phase field model, where the reduced cost functional $j$ is differentiable in $H^1\cap L^\infty$. The presented numerical results using the $H^1$ inner product and a pointwise chosen metric including second-order information show the expected mesh independence in the iteration numbers. The latter yields an additional, drastic decrease in iteration numbers as well as in computation time. Moreover, we present numerical results using a BFGS update of the $H^1$ inner product for further optimization problems based on phase field models.
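
To illustrate the iteration described above, the following is a minimal finite-dimensional Python sketch of a relaxed projected gradient step with Armijo backtracking. It assumes a Euclidean inner product, a fixed scaling `lam`, and a simple box projection as the convex constraint; all names, parameters, and the example problem are illustrative assumptions and not taken from the paper, which works with variable metrics and function-space settings.

```python
import numpy as np

def relaxed_projected_gradient(j, grad_j, project, phi0, lam=1.0,
                               sigma=1e-4, beta=0.5, tol=1e-8, max_iter=200):
    """Sketch of a relaxed projected gradient method with Armijo backtracking.

    Iteration: phi_{k+1} = phi_k + alpha_k * (P(phi_k - lam * grad j(phi_k)) - phi_k),
    where alpha_k is chosen by backtracking along d_k = P(phi_k - lam * grad j(phi_k)) - phi_k.
    """
    phi = np.asarray(phi0, dtype=float)
    for _ in range(max_iter):
        g = grad_j(phi)
        d = project(phi - lam * g) - phi            # relaxed search direction
        if np.linalg.norm(d) < tol:                 # stationarity measure
            break
        alpha, j_phi = 1.0, j(phi)
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        while j(phi + alpha * d) > j_phi + sigma * alpha * g.dot(d):
            alpha *= beta
        phi = phi + alpha * d
    return phi

if __name__ == "__main__":
    # Toy example: minimize a convex quadratic over the box [0, 1]^2
    A = np.array([[3.0, 0.5], [0.5, 2.0]])
    b = np.array([1.0, -2.0])
    j = lambda x: 0.5 * x @ A @ x - b @ x
    grad_j = lambda x: A @ x - b
    project = lambda x: np.clip(x, 0.0, 1.0)        # projection onto the box
    print(relaxed_projected_gradient(j, grad_j, project, phi0=np.zeros(2)))
```

In the paper's setting the projection $P_H$ and the gradient $\nabla_H j$ depend on the chosen (possibly iteration-dependent) inner product, which this Euclidean sketch does not capture.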