Self-Supervised Learning of Generative Spin-Glasses with Normalizing Flows
Gavin Hartnett, RAND
12:00 ET
Spin-glasses are universal models that can capture complex behavior of many-body systems at the interface of statistical physics and computer science, including discrete optimization, inference in graphical models, and automated reasoning. In this talk, I will discuss the problem of using normalizing flows to build generative models of spin-glasses. I will begin with a brief introduction to spin-glasses, and then discuss how the Hubbard-Stratonovich transformation may be used to convert the discrete Boltzmann distribution of a spin-glass into a continuous probability density. I will then discuss the problem of modeling the resulting continuous spin-glass using normalizing flows. Two approaches will be considered, one based on the forward KL divergence and one based on the reverse KL divergence. To evaluate both approaches, I will present numerical results for the Sherrington-Kirkpatrick spin-glass, which is known to exhibit rich phenomena such as replica symmetry breaking and ultrametricity. The forward KL approach is able to approximately capture these phenomena, whereas the reverse KL approach suffers from mode collapse. I will conclude with a discussion of the physical interpretation of the learned normalizing flow.
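The Hubbard-Stratonovich step described above can be made concrete with a short sketch. The following is a minimal NumPy illustration, not code from the talk: all function names are hypothetical, and the diagonal shift used to make the coupling matrix positive definite (a standard trick, since adding a constant to the diagonal only shifts the energy of ±1 spins) is an assumption about how the Gaussian integral is regularized. After summing out the discrete spins, the continuous density picks up a `log cosh` term per site.

```python
import numpy as np

def sk_couplings(n, rng):
    """Sherrington-Kirkpatrick couplings: symmetric Gaussian entries with
    variance ~ 1/n, diagonal shifted so J is positive definite (needed for
    the Hubbard-Stratonovich Gaussian integral; hypothetical helper)."""
    J = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
    J = (J + J.T) / 2.0
    np.fill_diagonal(J, 0.0)
    shift = -np.linalg.eigvalsh(J).min() + 0.1  # assumed regularization
    return J + shift * np.eye(n)

def hs_log_density(x, J, beta):
    """Unnormalized log density of the continuous spin-glass obtained by
    applying Hubbard-Stratonovich to exp(beta/2 * s^T J s) and summing
    out the discrete spins s in {-1,+1}^n:
        log p(x) = -(1/(2*beta)) x^T J^{-1} x
                   + sum_i log(2*cosh(x_i)) + const
    (the spin sum factorizes: sum_s exp(s.x) = prod_i 2*cosh(x_i))."""
    Jinv = np.linalg.inv(J)
    return -0.5 / beta * (x @ Jinv @ x) + np.sum(np.log(2.0 * np.cosh(x)))
```

A flow can then be fit to this density in the two ways the abstract contrasts: the forward KL objective is maximum likelihood on Monte Carlo samples drawn from the target, while the reverse KL objective trains on the flow's own samples using the known log density above, which makes it prone to mode collapse in a multi-modal glassy landscape.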