Working paper

FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures

Sébastien Gadat, Yohann De Castro and Clément Marteau

Abstract

This paper presents a novel algorithm that combines Stochastic Gradient Descent strategies with Random Features to improve the scalability of Conic Particle Gradient Descent (CPGD), specifically tailored to sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs of the following key findings: (i) the total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; (ii) we establish a global convergence guarantee with a convergence rate of O(log(K)/√K) over K iterations, showcasing the efficiency and effectiveness of our algorithm; (iii) additionally, we establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm’s behavior and reliability in practical applications.
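To make the setting concrete, the following is a minimal illustrative sketch in Python of a stochastic conic particle gradient descent loop with random features: particle masses are updated multiplicatively (the conic, mirror-descent geometry, which keeps them nonnegative) while particle positions are updated additively, and each iteration samples a fresh mini-batch of feature points. The quadratic data-fidelity objective, the Gaussian feature map phi, and all parameter values below are assumptions chosen for illustration, not the paper's exact construction.

import numpy as np

rng = np.random.default_rng(0)

def phi(x, t):
    # Assumed feature map: Gaussian bump centred at position x, evaluated at points t.
    return np.exp(-0.5 * (x[:, None] - t[None, :]) ** 2)

# Ground-truth sparse measure: two spikes (illustrative target).
true_pos, true_w = np.array([-1.0, 1.5]), np.array([1.0, 0.7])
def signal(t):
    return (true_w[:, None] * phi(true_pos, t)).sum(axis=0)

# Over-parameterization: many more particles than true spikes.
n_particles, lam, lr, K = 50, 0.01, 0.5, 2000
pos = rng.uniform(-3.0, 3.0, n_particles)          # particle positions x_i
weights = np.full(n_particles, 1.0 / n_particles)  # particle masses r_i >= 0

for k in range(K):
    # Stochastic step: sample a small batch of random feature points.
    t = rng.uniform(-4.0, 4.0, 32)
    residual = (weights[:, None] * phi(pos, t)).sum(axis=0) - signal(t)
    # Gradients of 0.5 * mean(residual**2) + lam * sum(weights),
    # where lam * sum(weights) penalises the total variation norm of the measure.
    grad_w = phi(pos, t) @ residual / t.size + lam
    grad_x = weights * ((-(pos[:, None] - t[None, :]) * phi(pos, t)) @ residual) / t.size
    # Conic geometry: multiplicative update on masses, additive update on positions.
    weights *= np.exp(-lr * grad_w)
    pos -= lr * grad_x

print("recovered total mass:", weights.sum())
print("positions of the two heaviest particles:",
      np.sort(pos[np.argsort(weights)[-2:]]))

In this sketch the exponential update on the masses plays the role of the conic (mirror) step, and the mini-batch of feature points stands in for the random-feature approximation that makes each iteration cheap.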

Reference

Sébastien Gadat, Yohann De Castro and Clément Marteau, “FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures”, TSE Working Paper, No. 23-1494, December 2023.

Published in

TSE Working Paper, No. 23-1494, December 2023