
Stochastic Subgradient Descent Escapes Active Strict Saddles

Abstract: In non-smooth stochastic optimization, we establish the non-convergence of the stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold M where the function f has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of f is lower-bounded. We require two conditions on f. The first assumption is a Verdier stratification condition, which is a refinement of the popular Whitney stratification. It allows us to establish a reinforced version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second assumption, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, the SGD converges to a local minimizer.
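The escape phenomenon described in the abstract can be illustrated numerically. The sketch below is not from the paper: it uses the standard toy example f(x, y) = |x| - y², which has an active strict saddle at the origin (the active manifold is M = {x = 0}, f restricted to M has negative curvature in y, and off M the Clarke subgradient norm is at least 1). All function and parameter names are illustrative choices.

```python
import random

# Illustrative toy example (not the paper's construction):
# f(x, y) = |x| - y**2 has an active strict saddle at (0, 0).

def subgradient(x, y):
    # A Clarke subgradient of f at (x, y): sign(x) in the x-coordinate
    # (any value in [-1, 1] is a valid choice at x = 0), and -2y in y.
    gx = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    return gx, -2.0 * y

def sgd(x, y, step=0.01, noise=0.01, iters=2000, seed=0):
    # Constant-step stochastic subgradient descent with additive
    # Gaussian noise on each coordinate of the subgradient.
    rng = random.Random(seed)
    for _ in range(iters):
        gx, gy = subgradient(x, y)
        x -= step * (gx + rng.gauss(0.0, noise))
        y -= step * (gy + rng.gauss(0.0, noise))
    return x, y

x, y = sgd(0.5, 0.0)
# The iterate is first attracted to the manifold M = {x = 0}; the noise
# then perturbs y off zero, and the negative curvature of f along M
# drives |y| away from the saddle, so the method does not converge to it.
```

With the noise term switched off and the initialization exactly on M, the deterministic iterates would stay on M and slide into the saddle; the noise is what makes non-convergence to the saddle generic, in line with the abstract's claim.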

Contributor: Walid Hachem
Submitted on: Tuesday, November 23, 2021 - 7:47:26 AM
Last modification on: Friday, January 14, 2022 - 3:41:33 AM
Long-term archiving on: Thursday, February 24, 2022 - 6:16:57 PM




  • HAL Id: hal-03442137, version 1


Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic Subgradient Descent Escapes Active Strict Saddles. 2021. ⟨hal-03442137⟩


