Professor Nick Harvey and his collaborators give a new solution to a classical question: how many samples must one draw from a mixture of Gaussians in order to find a distribution that approximates it well? Their paper, titled "Nearly tight sample complexity bounds for learning mixtures of Gaussians via sample compression schemes", determines the near-optimal number of samples for arbitrary mixtures of high-dimensional Gaussians.
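To make the learning problem concrete, here is a small illustrative sketch (not the paper's algorithm): draw samples from a two-component Gaussian mixture and recover its parameters with plain expectation-maximization, using NumPy only. The mixture weights, means, and sample size below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw n samples from the mixture 0.5*N(-2, 1) + 0.5*N(3, 1).
n = 5000
z = rng.random(n) < 0.5
x = np.where(z, rng.normal(-2.0, 1.0, n), rng.normal(3.0, 1.0, n))

# Plain EM for a two-component 1D Gaussian mixture (illustrative only).
w = np.array([0.5, 0.5])        # mixing weights
mu = np.array([-1.0, 1.0])      # initial means
sigma = np.array([1.0, 1.0])    # initial standard deviations
for _ in range(200):
    # E-step: posterior responsibility of each component for each sample.
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
             / (sigma * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted maximum-likelihood updates.
    nk = r.sum(axis=0)
    w = nk / n
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(sorted(mu))  # estimated means, close to the true values -2 and 3
```

With enough samples the estimates concentrate around the true parameters; the paper's contribution is pinning down, up to logarithmic factors, how many samples are necessary and sufficient in general.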
Their paper received a Best Paper Award at NeurIPS 2018, the flagship conference in machine learning. It was one of only four papers so honoured out of 4854 submissions.
The co-authors of this work are: Hassan Ashtiani (McMaster), Shai Ben-David (Waterloo), Nicholas Harvey (UBC CS), Christopher Liaw (UBC CS), Abbas Mehrabian (McGill, formerly a UBC postdoc), and Yaniv Plan (UBC Math).