Distributed First-Order Optimization with Tamed Communications

Abstract

Many machine learning and signal processing applications involve high-dimensional nonsmooth optimization problems. The nonsmoothness is essential as it brings a low-dimensional structure to the optimal solutions, such as (block, rank, or variation) sparsity. In this work, we exploit this nonsmoothness to reduce the communication cost of optimization algorithms solving these problems in a distributed setting. We introduce two key ideas: i) a random subspace descent algorithm along sparsity directions; ii) an adaptive subspace selection based on sparsity identification by the proximal operator. We obtain significant performance improvements in terms of convergence with respect to the amount of data exchanged.
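To make the two ideas concrete, here is a minimal, illustrative Python sketch (not the implementation from the talk): a proximal gradient loop on a lasso-type problem in which only a random subset of coordinates is updated at each iteration, standing in for the coordinates a worker would communicate, and where the sampling is adaptively biased toward the support identified by the soft-thresholding (prox) step. The problem setup, the function name `random_subspace_prox_grad`, and the sampling weights are assumptions chosen for illustration.

```python
# Illustrative sketch only: random subspace proximal gradient with adaptive
# selection of coordinates, mimicking reduced communication in a distributed run.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def random_subspace_prox_grad(A, b, lam, n_iter=500, frac=0.2, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1 / L with L = ||A||_2^2
    coords_sent = 0                                   # proxy for communication cost
    for _ in range(n_iter):
        # Adaptive subspace selection: sample more often the coordinates that
        # the prox currently identifies as nonzero (the sparsity pattern).
        support = x != 0
        probs = np.where(support, 1.0, 0.1)
        probs /= probs.sum()
        k = max(1, int(frac * n))
        S = rng.choice(n, size=k, replace=False, p=probs)

        # Gradient of the smooth part, then a prox step restricted to the
        # sampled subspace; only those |S| coordinates would be exchanged.
        grad = A.T @ (A @ x - b)
        x_new = x.copy()
        x_new[S] = soft_threshold(x[S] - step * grad[S], step * lam)
        coords_sent += k
        x = x_new
    return x, coords_sent

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 200))
    x_true = np.zeros(200); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat, sent = random_subspace_prox_grad(A, b, lam=0.5)
    print("nonzeros:", np.count_nonzero(x_hat), "coordinates sent:", sent)
```

The point of the sketch is the accounting: convergence is measured against `coords_sent` (data exchanged) rather than iteration count, which is the regime in which the subspace descent and adaptive selection pay off.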

Location
Toulouse, France
Dmitry Grishchenko
Senior Engineer