Physics ∩ ML

A virtual hub at the interface of theoretical physics and deep learning.

09 Sep 2020

Insights on gradient-based algorithms in high-dimensional learning.

Lenka Zdeborova, EPFL, 12:00 EDT

Abstract: Gradient descent algorithms and their noisy variants, such as Langevin dynamics or multi-pass SGD, are at the center of attention in machine learning. Yet their behaviour remains perplexing, in particular in the high-dimensional non-convex setting. In this talk, I will present several high-dimensional and (mostly) non-convex statistical learning problems in which the performance of gradient-based algorithms can be analysed down to a constant. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of the gradient-based algorithms. The covered settings include the spiked mixed matrix-tensor model, the perceptron, and phase retrieval.
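As a point of reference for the dynamics discussed in the abstract, here is a minimal sketch of Langevin dynamics: plain gradient descent plus Gaussian noise whose scale is set by a temperature parameter. The loss function, step size, and temperature below are illustrative choices, not the models analysed in the talk.

```python
import numpy as np

def langevin_descent(grad, x0, lr=0.01, temperature=0.01, steps=1000, seed=0):
    """Langevin dynamics: gradient descent with injected Gaussian noise
    of scale sqrt(2 * lr * temperature) per step. At temperature 0 this
    reduces to plain gradient descent."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        noise = rng.standard_normal(x.shape)
        x = x - lr * grad(x) + np.sqrt(2 * lr * temperature) * noise
    return x

# Toy non-convex loss (for illustration only): f(x) = (||x||^2 - 1)^2,
# whose minima form the unit sphere; gradient is 4 * (||x||^2 - 1) * x.
grad = lambda x: 4.0 * (x @ x - 1.0) * x

# Start near the origin; the noisy dynamics relax toward the sphere ||x|| = 1.
x_final = langevin_descent(grad, x0=0.1 * np.ones(10))
```

At low temperature the iterates concentrate near the minimum manifold; raising the temperature lets the dynamics escape shallow basins, which is one reason noisy variants are studied in non-convex settings.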

Slides and video of the talk are both available.
