High Probability Convergence of Stochastic Gradient Methods

February 28, 2023 · Declared Dead · 🏛 International Conference on Machine Learning

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Zijian Liu, Ta Duy Nguyen, Thien Hang Nguyen, Alina Ene, Huy Lê Nguyen
arXiv ID: 2302.14843
Category: math.OC: Optimization & Control
Cross-listed: cs.DS, cs.LG
Citations: 61
Venue: International Conference on Machine Learning
Last Checked: 2 months ago
Abstract
In this work, we describe a generic approach to show convergence with high probability for both stochastic convex and non-convex optimization with sub-Gaussian noise. In previous works for convex optimization, either the convergence is only in expectation or the bound depends on the diameter of the domain. Instead, we show high probability convergence with bounds depending on the initial distance to the optimal solution. The algorithms use step sizes analogous to the standard settings and are universal to Lipschitz functions, smooth functions, and their linear combinations. This method can be applied to the non-convex case. We demonstrate an $O((1+\sigma^{2}\log(1/\delta))/T+\sigma/\sqrt{T})$ convergence rate when the number of iterations $T$ is known and an $O((1+\sigma^{2}\log(T/\delta))/\sqrt{T})$ convergence rate when $T$ is unknown for SGD, where $1-\delta$ is the desired success probability. These bounds improve over existing bounds in the literature. Additionally, we demonstrate that our techniques can be used to obtain a high probability bound for AdaGrad-Norm (Ward et al., 2019) that removes the bounded gradients assumption from previous works. Furthermore, our technique for AdaGrad-Norm extends to the standard per-coordinate AdaGrad algorithm (Duchi et al., 2011), providing the first noise-adapted high probability convergence for AdaGrad.
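Since the scanner found no code, here is a minimal, unofficial sketch of the AdaGrad-Norm update (Ward et al., 2019) that the abstract refers to; the gradient oracle grad_fn, the base step size eta, and the initializer b0 are illustrative placeholders, not details taken from the paper.

    import numpy as np

    def adagrad_norm(grad_fn, x0, T, eta=1.0, b0=1e-2):
        # AdaGrad-Norm: a single scalar step size divided by the running
        # root of the accumulated squared stochastic-gradient norms.
        x = np.asarray(x0, dtype=float)
        b_sq = b0 ** 2
        for _ in range(T):
            g = grad_fn(x)               # stochastic gradient (sub-Gaussian noise assumed)
            b_sq += float(np.dot(g, g))  # accumulate ||g_t||^2
            x = x - (eta / np.sqrt(b_sq)) * g
        return x

    # Toy usage: noisy gradients of f(x) = ||x||^2, starting from x0 = (1, ..., 1).
    rng = np.random.default_rng(0)
    noisy_grad = lambda x: 2.0 * x + 0.1 * rng.standard_normal(x.shape)
    x_hat = adagrad_norm(noisy_grad, x0=np.ones(5), T=10_000)

The update rule itself is standard; the paper's claimed contribution is a high probability convergence analysis of this kind of method that removes the bounded gradients assumption.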
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt · Optimization & Control

Died the same way · 👻 Ghosted