# Rehearsal of defence

Hello everyone,

On Tuesday, $29^{\text{th}}$ September, I will hold a rehearsal for my Ph.D. defence.

Title: Proximal Optimization with Automatic Dimension Reduction for Large Scale Learning.

Abstract: In this thesis, we develop a framework to reduce the dimensionality of composite optimization problems with sparsity-inducing regularizers. Based on the identification property of proximal methods, we first develop a "sketch-and-project" method that uses projections based on the structure of the current point. This method allows us to work with random low-dimensional subspaces instead of the full space when the final solution is sparse. Second, we place ourselves in the context of delay-tolerant asynchronous proximal methods and use our dimension reduction technique to decrease the total size of communications. However, this technique is proven to converge only for well-conditioned problems, both in theory and in practice. Thus, we investigate wrapping it into a proximal reconditioning framework. This leads to a theoretically backed algorithm that is guaranteed to cost less in terms of communications compared with a non-sparsified version; we show in practice that it yields faster runtime convergence when the sparsity of the problem is sufficiently high.

The talk will take place in Bâtiment IMAG, room $\boldsymbol{106}$ (first floor), at $\boldsymbol{14^{30}}$ on Tuesday, $\boldsymbol{29^{\text{th}}}$ September.

All are invited to attend.