Gradient Descent Converges to Minimizers
Authors: Jason D. Lee, Max Simchowitz, Michael I. Jordan, Benjamin Recht
Published: 2016
Algorithm: Gradient Descent
arXiv: 1602.04915
Summary
Abstract
We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
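The result can be illustrated numerically. The sketch below (not from the paper; the test function and step size are my own choices) runs gradient descent on f(x, y) = (x² − 1)²/4 + y²/2, which has minimizers at (±1, 0) and a strict saddle at the origin. The saddle's stable manifold is the measure-zero line x = 0: a random initialization almost surely misses it and escapes to a minimizer, while an initialization exactly on it converges to the saddle.

```python
import numpy as np

def grad(p):
    """Gradient of f(x, y) = (x^2 - 1)^2 / 4 + y^2 / 2.

    f has minimizers at (+-1, 0) and a strict saddle at (0, 0),
    whose stable manifold is the line x = 0.
    """
    x, y = p
    return np.array([(x**2 - 1) * x, y])

def gradient_descent(p0, step=0.1, iters=1000):
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad(p)
    return p

rng = np.random.default_rng(0)

# Random initialization: almost surely off the measure-zero
# stable manifold, so the iterates escape the saddle.
p = gradient_descent(rng.normal(size=2))
print(p)  # close to (1, 0) or (-1, 0)

# Initialization exactly on the stable manifold (x = 0):
# the iterates converge to the saddle point instead.
print(gradient_descent([0.0, 0.5]))  # converges to (0, 0)
```

This mirrors the proof idea: the Stable Manifold Theorem says the set of initializations attracted to a strict saddle lies in a lower-dimensional manifold, which has measure zero, so random initialization avoids it with probability one.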