Physics ∩ ML

a virtual hub at the interface of theoretical physics and deep learning.

27 Jul 2022

Minerva: Solving Quantitative Reasoning Problems with Language Models

Guy Gur-Ari, Google Brain

Abstract:
Quantitative reasoning tasks, which can involve mathematics, science, and programming, are often challenging for machine learning models in general and for language models in particular. We show that transformer-based language models achieve significantly better performance on math and science questions when trained in an unsupervised way on a large, math-focused dataset. Performance improves further with prompting and sampling techniques, including chain-of-thought prompting and majority voting. Minerva, a model that combines these techniques, achieves state-of-the-art results on several math and science benchmarks. I will describe the model, its capabilities, and its limitations.
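The majority-voting step mentioned in the abstract can be sketched simply: sample several chain-of-thought completions for the same question, extract each one's final answer, and return the most frequent answer. The helper name and the sample answers below are hypothetical illustrations, not part of Minerva itself.

```python
from collections import Counter

def majority_vote(answers):
    """Return the most common final answer among sampled completions.

    Ties are broken by first occurrence, since Counter.most_common
    preserves insertion order for equal counts.
    """
    counts = Counter(answers)
    answer, _ = counts.most_common(1)[0]
    return answer

# Hypothetical final answers extracted from five sampled
# chain-of-thought completions for one question.
samples = ["42", "41", "42", "42", "17"]
print(majority_vote(samples))  # → 42
```

The intuition is that many independently sampled reasoning paths may reach the same correct answer even when any single sample is unreliable.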

video
