Abstract: We study the problem of minimizing a convex, non-smooth Lipschitz function over a convex domain when only noisy stochastic subgradient estimates are available. We analyze the classical Stochastic Mirror Descent (SMD) algorithm and derive new tail bounds on its optimization error, for both the averaged and the last iterate. Our results extend existing analyses, which have traditionally been limited to light-tailed, sub-Gaussian noise, to heavier-tailed noise distributions. We specialize our general bounds to two important families of noise: one with exponential tails and another with polynomial tails. Notably, our bounds for the averaged iterate reveal a distinct two-regime behavior, offering new insight into the interplay between noise tails and convergence rates.
MIP Seminar: Andrea Paudice (Aarhus University)
Date:
- Thu., 29 January 2026
- 16:15 - 18:00
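
The abstract centers on the Stochastic Mirror Descent iteration and the distinction between the averaged and the last iterate. As a point of reference, below is a minimal sketch of SMD with the negative-entropy mirror map over the probability simplex (the exponentiated-gradient update), returning both iterates. The function names, the linear objective, and the Student-t noise model are illustrative assumptions chosen to mimic the polynomial-tail regime, not details of the talk.

```python
import numpy as np

def smd_entropic(grad_oracle, d, T, eta):
    """Stochastic Mirror Descent on the probability simplex with the
    negative-entropy mirror map (exponentiated-gradient update).
    Returns (averaged iterate, last iterate)."""
    x = np.full(d, 1.0 / d)                   # start at the uniform distribution
    avg = np.zeros(d)
    for t in range(1, T + 1):
        g = grad_oracle(x)                    # noisy stochastic subgradient at x
        w = x * np.exp(-eta * (g - g.max()))  # mirror step; shift for numerical stability
        x = w / w.sum()                       # normalize back onto the simplex
        avg += (x - avg) / t                  # running average of x_1, ..., x_t
    return avg, x

# Usage sketch: minimize a linear objective <c, x> over the simplex, with the
# subgradient corrupted by heavy-tailed Student-t noise (an illustrative
# stand-in for the polynomial-tail noise family mentioned in the abstract).
rng = np.random.default_rng(0)
c = rng.standard_normal(16)
oracle = lambda x: c + rng.standard_t(df=3, size=c.size)
x_avg, x_last = smd_entropic(oracle, d=16, T=10_000, eta=0.05)
```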