Subgradient sampling for nonsmooth nonconvex minimization

Abstract : Risk minimization for nonsmooth nonconvex problems naturally leads to first-order sampling or, by an abuse of terminology, to stochastic subgradient descent. We establish the convergence of this method in the path-differentiable case, and describe more precise results under additional geometric assumptions. We recover and improve results of Ermoliev and Norkin by using a different approach: conservative calculus and the ODE method. In the definable case, we show that first-order subgradient sampling avoids artificial critical points with probability one and, moreover, applies to a large range of risk minimization problems in deep learning, based on the backpropagation oracle. As byproducts of our approach, we obtain several results on integration of independent interest, such as an interchange result for conservative derivatives and integrals, and the definability of set-valued parameterized integrals.
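
To make the method concrete: the abstract describes first-order subgradient sampling, i.e., drawing one data point per iteration and stepping along a subgradient returned by the backpropagation oracle, with vanishing step sizes. The following is a minimal illustrative sketch, not the paper's implementation; the model, synthetic data, and step-size schedule are placeholder assumptions chosen to give a nonsmooth (path-differentiable) loss.

```python
# Sketch of first-order subgradient sampling (stochastic subgradient
# descent) for nonsmooth risk minimization, with backpropagation as
# the subgradient oracle. All problem data below are hypothetical.
import torch

torch.manual_seed(0)

# Hypothetical dataset: n samples with features X[i] and labels y[i].
n, d = 200, 5
X = torch.randn(n, d)
y = torch.randn(n)

# A nonsmooth model: one hidden ReLU layer (path-differentiable).
model = torch.nn.Sequential(
    torch.nn.Linear(d, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
)

for k in range(1, 10_001):
    i = torch.randint(n, (1,)).item()        # sample one data point
    loss = (model(X[i]) - y[i]).abs().sum()  # nonsmooth per-sample loss
    model.zero_grad()
    loss.backward()                          # backpropagation oracle
    gamma = 0.1 / k**0.75                    # vanishing step sizes
    with torch.no_grad():
        for p in model.parameters():
            p -= gamma * p.grad              # subgradient step
```

The step sizes 0.1 / k**0.75 are one standard choice satisfying the usual conditions (divergent sum, square-summable) assumed in convergence analyses of this kind; any schedule with those properties would serve in this sketch.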
Document type : Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-03579383
Contributor : Tam Le
Submitted on : Friday, February 25, 2022 - 11:22:50 AM
Last modification on : Wednesday, June 1, 2022 - 3:52:32 AM

Files

subgradientSampling.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-03579383, version 2
  • ARXIV : 2202.13744

Citation

Jérôme Bolte, Tam Le, Edouard Pauwels. Subgradient sampling for nonsmooth nonconvex minimization. 2022. ⟨hal-03579383v2⟩
