Research Fellow, Stanford University
Wednesday, February 13, 2019 @ 11:00am-12:00pm
Room 1242, CSE Building
A central algorithmic question is: when can we efficiently sample from a probability distribution? In the continuous setting, convex sets and, more generally, log-concave distributions are the main objects from which we can efficiently produce samples. In this talk, I will define a parallel theory of convexity for discrete distributions and show how to use it to obtain efficient sampling, counting, and optimization algorithms. The main property these algorithms exploit is the absence of bottlenecks in the geometry of local transitions.
Examples of distributions in this class include uniform distributions on bases or independent sets of matroids, determinantal point processes and their fractional powers, and the random cluster (or Potts) model in the "negative dependence regime" of q < 1. I will discuss consequences of our efficient sampling algorithm, which include diverse sampling in machine learning and estimation algorithms for the reliability of road/communication networks, linear codes, and trusses/structures.
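For concreteness, the local transitions underlying these results can be illustrated on the simplest matroid example: uniform spanning trees of a graph, the bases of its graphic matroid. The sketch below (function names are illustrative, not from the talk) runs the bases-exchange "down-up" walk: drop a uniformly random edge of the current tree, then re-add a uniformly random edge that restores a spanning tree.

```python
import random
from itertools import combinations

def is_spanning_tree(n, edges):
    """Union-find check that `edges` forms a spanning tree on vertices 0..n-1."""
    if len(edges) != n - 1:
        return False
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:          # adding (u, v) would close a cycle
            return False
        parent[ru] = rv
    return True

def greedy_tree(n, graph_edges):
    """Build an initial spanning tree greedily (Kruskal-style, no weights)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = set()
    for u, v in graph_edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.add((u, v))
    return tree

def down_up_step(n, graph_edges, tree, rng):
    """One step of the bases-exchange ("down-up") walk on spanning trees."""
    # Down: drop a uniformly random edge, splitting the tree in two.
    dropped = rng.choice(sorted(tree))
    partial = tree - {dropped}
    # Up: add a uniformly random edge (possibly the dropped one) that
    # reconnects the two components into a spanning tree.
    candidates = [e for e in graph_edges
                  if e not in partial and is_spanning_tree(n, partial | {e})]
    return partial | {rng.choice(candidates)}

# Run the walk on the complete graph K4, which has 16 spanning trees.
n = 4
edges = list(combinations(range(n), 2))
rng = random.Random(0)
tree = greedy_tree(n, edges)
for _ in range(200):
    tree = down_up_step(n, edges, tree, rng)
print(sorted(tree))  # one (approximately uniform) random spanning tree
```

This toy walk is only meant to make the local transitions concrete; the talk's contribution is the discrete convexity framework under which walks of this type provably mix rapidly.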
Nima Anari is a research fellow at the Simons Institute for the Theory of Computing and a researcher in the Department of Computer Science at Stanford University. He obtained his Ph.D. in computer science from UC Berkeley, advised by Satish Rao, and was subsequently a postdoctoral scholar in the Department of Management Science and Engineering (MS&E) at Stanford University, hosted by Amin Saberi. He has received a Microsoft Research Fellowship, a Simons-Berkeley Research Fellowship, and a Berkeley Graduate Studies Fellowship.
He is interested in the design and analysis of algorithms, with a particular focus on the interplay between combinatorics, discrete probability distributions, and the geometry of polynomials.
Faculty Host: Shachar Lovett