Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions


Yurii Nesterov, Université catholique de Louvain, Belgium


2017.09.03 09:30-10:30


Middle Lecture Room, Math Building


In this talk, we present accelerated Regularized Newton Methods for minimizing objectives formed as a sum of two functions: one is convex and twice differentiable with Hölder-continuous Hessian, and the other is a simple closed convex function. For the case in which the Hölder parameter $\nu$ is known, we propose methods that take at most $O(1/\epsilon^{1/(2+\nu)})$ iterations to reduce the functional residual below a given precision $\epsilon > 0$. For the general case, in which the parameter is not known, we propose a universal method that ensures the same precision in at most $O(1/\epsilon^{2/[3(1+\nu)]})$ iterations.
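For reference, a minimal sketch of the setting described above, assuming the standard formulation of Hölder continuity and a regularized second-order step (the constant $H_\nu$, the regularization parameter $M$, and the notation $x_{+}$, $\psi$ are illustrative choices, not taken from the talk):

```latex
% Hölder continuity of the Hessian with exponent \nu \in [0,1]:
\|\nabla^2 f(x) - \nabla^2 f(y)\| \le H_\nu \|x - y\|^{\nu},
\quad \forall\, x, y.

% A regularized Newton step minimizes the second-order model of the
% smooth part f, the simple convex term \psi, and a regularizer of
% order 2 + \nu (for \nu = 1 this reduces to cubic regularization
% with coefficient M/6):
x_{+} = \arg\min_{y} \Big\{ \langle \nabla f(x), y - x \rangle
  + \tfrac{1}{2} \langle \nabla^2 f(x)(y - x), y - x \rangle
  + \tfrac{M}{(1+\nu)(2+\nu)} \|y - x\|^{2+\nu} + \psi(y) \Big\}.
```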

These are the first second-order schemes that can automatically adapt to the actual level of smoothness of the objective function.

This is joint work with Geovani Grapiglia, University of Paraná (Brazil).