Working paper

Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning

Jérôme Bolte and Edouard Pauwels

Abstract

Modern problems in AI or in numerical analysis require nonsmooth approaches with a flexible calculus. We introduce generalized derivatives called conservative fields, for which we develop a calculus and provide representation formulas. Functions having a conservative field are called path differentiable: convex, concave, Clarke regular, and any semialgebraic Lipschitz continuous functions are path differentiable. Using Whitney stratification techniques for semialgebraic and definable sets, our model provides variational formulas for nonsmooth automatic differentiation oracles, such as the famous backpropagation algorithm in deep learning. Our differential model is applied to establish the convergence in values of nonsmooth stochastic gradient methods as they are implemented in practice.
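The setting the abstract describes can be illustrated with a minimal sketch (not the paper's algorithm or code, and purely hypothetical): a stochastic subgradient method on the nonsmooth objective f(w) = E|w - xi| with xi ~ N(0, 1), whose minimizer is the median 0. The oracle sign_conv below fixes one value at the kink, mirroring how automatic differentiation frameworks select a single element of the Clarke subdifferential; conservative fields provide the calculus that justifies such selections.

    # Minimal illustrative sketch (assumed example, not from the paper):
    # stochastic subgradient descent on f(w) = E|w - xi|, xi ~ N(0, 1).
    import numpy as np

    def sign_conv(t):
        # Subgradient oracle for |t|: at the kink t = 0 any value in
        # [-1, 1] is a valid selection; we mimic the common autodiff
        # convention of returning 0.0 there.
        return 1.0 if t > 0 else (-1.0 if t < 0 else 0.0)

    rng = np.random.default_rng(0)
    w = 5.0
    for k in range(1, 10001):
        xi = rng.normal()                # random sample
        w -= sign_conv(w - xi) / k       # vanishing, square-summable steps
    print(f"w after 10000 steps: {w:.3f}")  # drifts toward the minimizer 0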

Keywords

Deep learning, Automatic differentiation, Backpropagation algorithm, Nonsmooth stochastic optimization, Definable sets, o-minimal structures, Stochastic gradient, Clarke subdifferential, First-order methods

Replaced by

Jérôme Bolte and Edouard Pauwels, "Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning", Mathematical Programming, April 2020, pp. 1–33.

Reference

Jérôme Bolte and Edouard Pauwels, "Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning", TSE Working Paper, no. 19-1044, October 2019.

Published in

TSE Working Paper, no. 19-1044, October 2019