Revisiting the Polyak step size
Authors: Elad Hazan, Sham Kakade
Published: 2019
arXiv: 1905.00313
Abstract
This paper revisits the Polyak step size schedule for convex optimization problems, proving that a simple variant of it simultaneously attains near-optimal convergence rates for the gradient descent algorithm across all ranges of the strong convexity, smoothness, and Lipschitz parameters, without a priori knowledge of these parameters.
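For context, the classical Polyak step size sets the learning rate at each iteration to η_t = (f(x_t) − f*) / ‖∇f(x_t)‖², where f* is the optimal value. Below is a minimal sketch of vanilla gradient descent with this classical rule on a toy quadratic; the objective, function names, and iteration count are illustrative assumptions, not the paper's variant (which removes the need to know f* and the problem parameters).

```python
# Sketch of gradient descent with the classical Polyak step size
#   eta_t = (f(x_t) - f*) / ||grad f(x_t)||^2.
# Illustrative only: the paper studies a variant that avoids needing f*.

def polyak_gd(f, grad, x0, f_star, iters=500):
    """Run gradient descent with the Polyak step size from x0."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 == 0.0:  # stationary point reached
            break
        eta = (f(x) - f_star) / gnorm2  # classical Polyak step size
        x = [xi - eta * gi for xi, gi in zip(x, g)]
    return x

# Toy strongly convex, smooth quadratic f(x) = x1^2 + 10*x2^2, with f* = 0.
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]

x = polyak_gd(f, grad, [5.0, -3.0], f_star=0.0)
```

On this problem the condition number is 10, and the Polyak rule adapts the step each iteration without requiring the strong convexity or smoothness constants themselves, only the optimal value f*.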