Kobbi Nissim (Theory Seminar)

"Learning threshold functions under Differential Privacy"
Kobbi Nissim
(Ben-Gurion University and CRCS@Harvard)
Tuesday, October 13, 2015, 1:00pm
EBU3B, Room 4258

Learning abstracts many of the computations performed over large collections of sensitive individual information, and is hence a natural task to examine in conjunction with differential privacy. Private learning, formalized by Kasiviswanathan et al. in 2008, is the conjunction of PAC learning and differential privacy. Kasiviswanathan et al. showed that any concept class C can be learned privately with O(log|C|) samples, via a construction somewhat analogous to the (non-private) Occam's Razor learner. In comparison, Theta(VC(C)) samples are necessary and sufficient for learning without privacy.
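To make the gap concrete for the talk's concept class: a small Python sketch (illustrative only; the domain size T is an arbitrary choice) counting the threshold concepts over a finite ordered domain. Here log|C| grows with the domain size, while VC(C) = 1 for thresholds, so the generic O(log|C|) private learner uses many more samples than the Theta(VC(C)) non-private bound.

```python
import math

# Threshold concepts over the ordered domain {0, ..., T}:
# c_t(x) = 1 iff x >= t. (T = 1023 is a hypothetical small domain.)
T = 1023
domain = range(T + 1)

def threshold(t):
    return lambda x: int(x >= t)

# Distinct labelings of the domain, one per t in {0, ..., T+1}.
concepts = {tuple(threshold(t)(x) for x in domain) for t in range(T + 2)}

# Generic private learner (Kasiviswanathan et al.): O(log|C|) samples.
log_C = math.log2(len(concepts))   # ~10 here, grows with the domain size

# Non-private PAC learning: Theta(VC(C)) samples. VC dimension of
# thresholds is 1: any single point is shattered, no pair is.
vc = 1
```

The point of the sketch: for thresholds, |C| scales with |X| while VC(C) stays 1, so the generic private bound depends on the domain where the non-private bound does not.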

Investigating the gap between the sample complexity and computational complexity of private and non-private learners has resulted in a rich picture. We will explore parts of this picture, and in particular the sample complexity of privately learning threshold functions. Our bounds give the first separation between the sample complexity of properly learning a concept class with approximate differential privacy and that of learning without privacy.

To obtain our results, we give reductions in both directions between releasing and properly learning thresholds and the simpler interior point problem. Given a database D of elements from an ordered domain X, the interior point problem asks for an element of X between the smallest and largest elements in D. We introduce new recursive constructions for bounding the sample complexity of the interior point problem, as well as further reductions and techniques for proving impossibility results for other basic problems in differential privacy.
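As a point of comparison (not the paper's recursive construction): a standard baseline for the interior point problem is the exponential mechanism, which scores each candidate by how many database elements lie on its smaller side, and solves the problem with roughly O(log|X|/eps) samples. A minimal Python sketch, with hypothetical names:

```python
import math
import random

def dp_interior_point(db, domain, eps):
    """Exponential-mechanism baseline for the interior point problem.

    db: list of elements from the ordered domain; domain: list of all
    candidate outputs; eps: privacy parameter. Illustrative sketch only.
    """
    # Score: min of the counts on either side of x. It is positive
    # exactly when x lies between min(db) and max(db), and changing
    # one database row changes it by at most 1 (sensitivity 1).
    def score(x):
        return min(sum(1 for d in db if d <= x),
                   sum(1 for d in db if d >= x))

    # Exponential mechanism: sample x with probability
    # proportional to exp(eps * score(x) / 2).
    weights = [math.exp(eps * score(x) / 2) for x in domain]
    r = random.random() * sum(weights)
    for x, w in zip(domain, weights):
        r -= w
        if r <= 0:
            return x
    return domain[-1]
```

With |db| on the order of log|X|/eps, the sampled point has positive score, i.e. lies between the smallest and largest database elements, with high probability. The talk's recursive constructions push the sample complexity far below this log|X| baseline.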

Joint work with Mark Bun, Uri Stemmer, and Salil Vadhan [FOCS 2015]