Seminar

DoStoVoQ: Doubly Stochastic Voronoi Vector Quantization, with applications to federated learning

Aymeric Dieuleveut (Ecole Polytechnique, Palaiseau)

March 31, 2022, 11:00–12:15

Toulouse

Room A3

MAD-Stat. Seminar

Abstract

Let's talk about random compression, bias, unitarily invariant codebooks, and gradient weight distributions! The growing size of models and datasets has made the distributed implementation of stochastic gradient descent (SGD) an active field of research. However, the high bandwidth cost of communicating gradient updates between nodes remains a bottleneck; lossy compression is one way to alleviate this problem. We propose a new unbiased Vector Quantizer (VQ), named StoVoQ, to perform gradient quantization. This approach introduces randomness within the quantization process, relying on unitarily invariant random codebooks and a straightforward bias-compensation method. The distortion of StoVoQ significantly improves upon existing quantization algorithms. Next, we explain how to combine this quantization scheme with a Federated Learning framework for complex high-dimensional models (dimension $>10^{6}$), introducing DoStoVoQ. We provide theoretical guarantees on the quadratic error and (absence of) bias of the compressor, which allow us to leverage strong convergence results, e.g., with heterogeneous workers or variance reduction. Finally, we show that, when training on convex and non-convex deep learning problems, our method leads to a significant reduction in bandwidth use while preserving model accuracy. Joint work with Louis Leconte, AD, Edouard Oyallon, Eric Moulines, and Gilles Pages.
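To make the core idea concrete, here is a minimal toy sketch of unbiased vector quantization with a rotation-invariant random codebook and a scalar bias-compensation factor. This is an illustration of the general principle only, not the authors' StoVoQ/DoStoVoQ scheme; the Gaussian codebook, the Monte Carlo estimate of the debiasing constant, and all names below are assumptions made for the example.

```python
import numpy as np

def make_codebook(dim, n_codewords, rng):
    # i.i.d. Gaussian codewords projected onto the unit sphere: the codebook
    # distribution is rotation (unitarily) invariant.
    C = rng.standard_normal((n_codewords, dim))
    return C / np.linalg.norm(C, axis=1, keepdims=True)

def nearest_index(C, u):
    # Voronoi assignment: codeword closest to u (largest inner product on the sphere).
    return int(np.argmax(C @ u))

def debias_factor(dim, n_codewords, rng, n_mc=2000):
    # By rotational symmetry, E[c_nearest(u)] = beta * u for a unit vector u,
    # with beta = E[<c_nearest(u), u>]. Estimate beta once, offline, by Monte Carlo.
    acc = 0.0
    for _ in range(n_mc):
        u = rng.standard_normal(dim)
        u /= np.linalg.norm(u)
        C = make_codebook(dim, n_codewords, rng)
        acc += C[nearest_index(C, u)] @ u
    return acc / n_mc

def quantize(x, n_codewords, beta, rng):
    # Unbiased compression of x: transmit the codeword index, the shared random
    # seed, and ||x||; the receiver rebuilds the codeword and rescales by 1/beta.
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return np.zeros_like(x)
    C = make_codebook(x.size, n_codewords, rng)
    k = nearest_index(C, x / norm)
    return (norm / beta) * C[k]

if __name__ == "__main__":
    dim, K = 32, 256
    rng = np.random.default_rng(0)
    beta = debias_factor(dim, K, rng)
    x = rng.standard_normal(dim)
    # Averaging many independent quantizations should recover x (unbiasedness).
    est = np.mean([quantize(x, K, beta, np.random.default_rng(s)) for s in range(500)], axis=0)
    print("relative error of the averaged estimate:", np.linalg.norm(est - x) / np.linalg.norm(x))
```

In this toy version the only quantities communicated per vector are a codeword index, a norm, and a shared seed, which is what makes random-codebook quantization attractive for bandwidth-limited federated settings.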