Adaptive Catalyst for smooth convex optimization

Abstract

In this paper, we present a generic framework that allows one to accelerate almost any non-accelerated deterministic or randomized algorithm for smooth convex optimization problems. The main approach of our \emph{envelope} is the same as in \textit{Catalyst} (Lin et al., 2015): an outer accelerated proximal gradient method serves as an envelope for a non-accelerated inner method applied to an $\ell_2$-regularized auxiliary problem. Our algorithm has two key differences: 1) an easily verifiable stopping criterion for the inner algorithm; 2) the regularization parameter can be adjusted during the run. As a result, the main contribution of this paper is a new framework that applies to adaptive inner algorithms: Steepest Descent, Adaptive Coordinate Descent, and Alternating Minimization. Moreover, in the non-adaptive case, our approach yields Catalyst without the logarithmic factor that appears in the standard Catalyst (Lin et al., 2015, 2018).
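
As a rough illustration of the envelope idea (the notation below, namely $f$, the prox-center $y^k$, and the regularization parameter $L$, is ours for exposition and may differ from the paper's), at each outer iteration the inner non-accelerated method approximately solves an $\ell_2$-regularized auxiliary problem of the form
\[
x^{k+1} \approx \arg\min_{x} \Big\{ f(x) + \frac{L}{2}\,\|x - y^k\|_2^2 \Big\},
\]
where the outer accelerated proximal scheme updates $y^k$ and, in the adaptive variant sketched in the abstract, the parameter $L$ may change between outer iterations.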