Seminar

On the SAGA Algorithm with Decreasing Step

Thiery-Emeric Gbaguidi (Université de Bordeaux; Institut de Mathématiques de Bordeaux)

28 May 2026, 11:00–12:15

Toulouse

Room: Auditorium 5

MAD-Stat Seminar

Abstract

Stochastic optimization problems appear naturally in many application areas, including machine learning. Our goal is to deepen the analysis of the Stochastic Average Gradient Accelerated (SAGA) algorithm. To this end, we introduce a new λ-SAGA algorithm that interpolates between Stochastic Gradient Descent (λ = 0) and the SAGA algorithm (λ = 1). First, we investigate the almost sure convergence of this new algorithm with decreasing step sizes, which allows us to dispense with the restrictive strong-convexity and Lipschitz-gradient assumptions on the objective function. Second, we establish a central limit theorem for the λ-SAGA algorithm. Finally, we provide non-asymptotic L^p convergence rates.
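Since the abstract does not spell out the update rule, the Python sketch below shows one natural way to realize the interpolation it describes: the iterate moves along the plain stochastic gradient plus λ times a SAGA-style variance-reduction correction built from a table of stored gradients, so λ = 0 gives SGD and λ = 1 gives SAGA. The function name `lambda_saga`, the gradient interface `grad_i`, and the decreasing step schedule γ_k = c/(k+1)^{3/4} are illustrative assumptions, not the speaker's exact formulation.

```python
import numpy as np

def lambda_saga(grad_i, x0, n, lam=1.0, n_iter=10000, c=0.1, rng=None):
    """Sketch of a lambda-SAGA iteration with decreasing step sizes.

    grad_i(x, i): gradient of the i-th component function f_i at x.
    lam = 0 recovers plain SGD; lam = 1 recovers SAGA.
    gamma_k = c / (k + 1)**0.75 is one admissible decreasing schedule
    (a hypothetical choice; the talk's step-size conditions may differ).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    table = np.array([grad_i(x, i) for i in range(n)])  # stored gradients alpha_i
    avg = table.mean(axis=0)                            # running mean of the table
    for k in range(n_iter):
        i = rng.integers(n)
        g = grad_i(x, i)
        # Interpolated direction: SGD term plus lam * variance-reduction term
        direction = g - lam * (table[i] - avg)
        gamma = c / (k + 1) ** 0.75
        x -= gamma * direction
        # Refresh the stored gradient for index i and the running average
        avg += (g - table[i]) / n
        table[i] = g
    return x
```

For instance, for a least-squares objective with rows a_i and targets b_i, one would pass `grad_i = lambda x, i: A[i] * (A[i] @ x - b[i])`; the point of the decreasing schedule is precisely that almost sure convergence can then be studied without strong convexity or a global Lipschitz gradient.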
