Daniel Kane, UCSD

"The Optimality of Polynomial Regression for Agnostic Learning under Gaussian Marginals"

Daniel Kane
Monday, May 16th, 2022, 2-3pm

Abstract:

We study the computational problem of learning Boolean functions over the Gaussian distribution in the presence of agnostic (adversarial label) noise. We develop techniques for constructing hard instances of this problem for which we can prove lower bounds in the statistical query (SQ) model of computation. By using linear programming duality to build these instances, we obtain lower bounds that essentially match the upper bounds given by the classic L1 polynomial regression algorithm.
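For readers unfamiliar with the upper-bound side, the sketch below is a minimal, hedged illustration (not the speaker's code) of L1 polynomial regression as an agnostic learner under a Gaussian marginal, shown in one dimension for readability: fit a low-degree polynomial minimizing the empirical L1 error via a linear program, then output the sign of the fitted polynomial. The degree, sample size, and noise rate are illustrative choices, not values from the talk.

```python
# Minimal sketch of L1 polynomial regression for agnostic learning under a
# standard Gaussian marginal (1-D).  Illustrative only; parameters are
# assumptions, not from the talk.
import numpy as np
from scipy.optimize import linprog

def l1_poly_regression(x, y, degree):
    """Fit coefficients c minimizing sum_i |p_c(x_i) - y_i| via an LP."""
    n = len(x)
    k = degree + 1
    # Monomial features; Hermite polynomials are the natural basis under the
    # Gaussian measure, but they span the same space of degree-d polynomials.
    A = np.vander(x, k, increasing=True)
    # Variables: [c (k coefficients), s (n slack variables)]; minimize sum(s)
    cost = np.concatenate([np.zeros(k), np.ones(n)])
    # Encode |A c - y| <= s as  A c - s <= y  and  -A c - s <= -y.
    A_ub = np.block([[A, -np.eye(n)], [-A, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * k + [(0, None)] * n)
    return res.x[:k]

def agnostic_learn(x, y, degree=8):
    """Return a Boolean hypothesis: the sign of the fitted polynomial."""
    c = l1_poly_regression(x, y, degree)
    return lambda z: np.sign(np.polyval(c[::-1], z))

# Usage: noisy labels of sign(x - 0.3) with 10% of labels flipped.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.sign(x - 0.3)
y[rng.random(2000) < 0.1] *= -1
h = agnostic_learn(x, y)
print("empirical error vs. true halfspace:", np.mean(h(x) != np.sign(x - 0.3)))
```

The LP formulation is the standard way to minimize an L1 objective exactly; the talk's contribution is on the other side, showing via SQ lower bounds that no efficient algorithm can do substantially better than this kind of polynomial regression.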