A new article on the acceleration of adaptive optimization methods.

Title: Adaptive Catalyst for smooth convex optimization

Together with Anastasiya Ivanova, Egor Shulgin, and Alexander Gasnikov from the MIPT research group, we propose a universal acceleration technique for adaptive methods on smooth convex (but not strongly convex) objective functions. It makes it possible to accelerate well-known methods such as Steepest Descent and the Random Adaptive Coordinate Descent Method in settings where the Lipschitz constant of the gradient is unknown (or expensive to compute) or varies significantly along the method's trajectory.
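To illustrate the general idea, here is a minimal sketch of a Catalyst-style outer loop wrapping an adaptive inner method. This is not the algorithm from the paper, only a toy instance of the envelope principle it builds on: each outer step approximately minimizes the regularized subproblem f(x) + κ/2·‖x − y‖² with an inner method that needs no Lipschitz constant (here, exact-line-search steepest descent on a quadratic), followed by a Nesterov-type extrapolation. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Toy problem (assumed for illustration): a convex quadratic
# f(x) = 0.5 x'Ax - b'x with positive definite A.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)          # exact minimizer, for checking
f = lambda x: 0.5 * x @ A @ x - b @ x

def steepest_descent(Q, c, x, iters):
    """Exact-line-search steepest descent on 0.5 x'Qx - c'x.
    Plays the role of the adaptive inner method: the step size comes
    from line search, so no Lipschitz constant is required."""
    for _ in range(iters):
        g = Q @ x - c
        denom = g @ Q @ g
        if denom == 0.0:
            break
        x = x - (g @ g / denom) * g
    return x

def catalyst(x0, kappa=1.0, outer=100, inner=50):
    """Catalyst-style envelope (sketch): approximately solve the
    kappa-regularized subproblem, then extrapolate. The alpha/beta
    schedule is the standard one for the convex case (q = 0)."""
    x_prev, y, alpha = x0.copy(), x0.copy(), 1.0
    for _ in range(outer):
        # Subproblem min_x f(x) + kappa/2 ||x - y||^2 is the quadratic
        # with Hessian A + kappa*I and linear term b + kappa*y.
        x = steepest_descent(A + kappa * np.eye(n), b + kappa * y, y, inner)
        # Solve alpha_new^2 = (1 - alpha_new) * alpha^2 for alpha_new.
        a_new = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
        beta = alpha * (1 - alpha) / (alpha**2 + a_new)
        y = x + beta * (x - x_prev)     # momentum (extrapolation) step
        x_prev, alpha = x, a_new
    return x_prev

x_acc = catalyst(np.zeros(n))
print(f(x_acc) - f(x_star))             # suboptimality gap, near zero
```

The point of the construction is that only the outer loop carries the acceleration; the inner method can be any reasonably efficient adaptive solver for the well-conditioned regularized subproblem.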

Dmitry Grishchenko
PhD student in Applied Mathematics