Distributed Optimization with Sparse Communications and Structure Identification

Abstract

We propose an efficient distributed algorithm for solving regularized learning problems. In a distributed framework where a master machine coordinates the computations of many slave machines, our proximal-gradient algorithm performs local computations and requires only sparse communications from slaves to master. Furthermore, with an $\ell_1$ regularizer, our approach automatically identifies the support of the solution, so communications from master to slaves are also sparse, with near-optimal support. We thus obtain an algorithm with two-way sparse communications.
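To illustrate the support-identification property the abstract relies on, here is a minimal single-machine sketch of proximal-gradient descent with an $\ell_1$ regularizer (ISTA) on a toy least-squares problem. This is not the distributed algorithm of the talk; the problem sizes, step size, and regularization weight are illustrative assumptions. The point is that the soft-thresholding step sets coordinates exactly to zero, so the iterates themselves reveal the support, which is what makes sparse master-to-slave communication possible.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero
    # and sets small coordinates exactly to zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal gradient.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy problem (assumed for illustration): only 2 of 10 coordinates carry signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:2] = [3.0, -2.0]
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L for the least-squares gradient
x_hat = proximal_gradient_lasso(A, b, lam=0.5, step=step)
support = np.flatnonzero(np.abs(x_hat) > 1e-8)
print(support)  # indices the iterates identified as the solution's support
```

Because coordinates off the support become exactly zero after thresholding, a master only needs to broadcast (and slaves only need to update) the identified support, which is the source of the two-way sparse communications described above.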

Location: Bordeaux, France
Dmitry Grishchenko
PhD student in Applied Mathematics