We show that a single, simple, plug-in estimator, profile maximum likelihood (PML), is sample competitive for all symmetric properties, and in particular is asymptotically sample-optimal for all the above properties.

We design a new, fast algorithm for agnostically learning univariate probability distributions whose densities are well approximated by piecewise polynomial functions.
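As a toy illustration of the piecewise-polynomial idea above (a minimal sketch with assumed parameters, not the paper's algorithm): partition the domain into a few intervals and least-squares-fit a low-degree polynomial to the empirical density on each piece.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 20000)  # samples from an unknown density

# Empirical density from a fine histogram on [-4, 4].
counts, edges = np.histogram(samples, bins=200, range=(-4, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2

# Partition into t pieces and fit a degree-d polynomial on each
# (t and d are illustrative choices, not the paper's parameters).
t, d = 8, 3
estimate = np.empty_like(counts)
for idx in np.array_split(np.arange(len(centers)), t):
    coeffs = np.polyfit(centers[idx], counts[idx], d)
    estimate[idx] = np.polyval(coeffs, centers[idx])

# L1 distance between the piecewise-polynomial fit and the empirical density.
width = edges[1] - edges[0]
l1_err = np.sum(np.abs(estimate - counts)) * width
print(round(l1_err, 3))
```

The fit smooths the histogram noise while tracking the density's shape; the actual algorithm additionally chooses the partition adaptively and comes with agnostic error guarantees.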

We provide a sample near-optimal algorithm that, given sample access to a distribution $P$ supported on $\{0,\dots,n\}$, tests whether $P$ is a Poisson binomial distribution or far from all Poisson binomial distributions.

We develop differentially private methods for estimating various distributional properties, including support size, support coverage, and entropy.
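One standard way to privatize a property estimate, shown here as a hedged sketch rather than the paper's method: compute the plug-in entropy and add Laplace noise calibrated to a crude sensitivity bound (the bound and parameter values below are assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(3)
n, eps = 10000, 1.0  # sample size and privacy parameter (illustrative values)

# Samples from a distribution over a small alphabet.
samples = rng.choice(4, size=n, p=[0.4, 0.3, 0.2, 0.1])

# Plug-in (empirical) entropy estimate, in nats.
counts = np.bincount(samples, minlength=4)
phat = counts / n
plugin = -np.sum(phat[phat > 0] * np.log(phat[phat > 0]))

# Laplace mechanism: changing one sample moves two counts by 1 each, which
# shifts the plug-in entropy by at most roughly 2(log n + 1)/n (a crude
# sensitivity bound assumed for this sketch; the paper's calibration differs).
sensitivity = 2 * (np.log(n) + 1) / n
private_estimate = plugin + rng.laplace(scale=sensitivity / eps)
print(round(private_estimate, 2))
```

Because the sensitivity shrinks with $n$, the added noise is tiny here, so privacy comes at little cost in accuracy for this estimator.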

We study the problems of classification and closeness testing. A classifier associates a test sequence with the one of two training sequences that was generated by the same distribution. A closeness…

We show how to recover the support of sparse high-dimensional vectors in the 1-bit compressive sensing framework with an asymptotically near-optimal number of measurements.

We derive lower bounds for the sample complexity of learning and testing discrete distributions in this information-constrained setting by studying the contraction, in chi-square distance, between the observed distributions of the samples under the imposed information constraints.
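The contraction phenomenon can be checked numerically (a minimal sketch with a randomly drawn channel, not the paper's constraint family): passing two distributions through any channel can only shrink their chi-square divergence, by the data-processing inequality for f-divergences.

```python
import numpy as np

def chi_square(p, q):
    # chi^2(p || q) = sum_i (p_i - q_i)^2 / q_i
    return np.sum((p - q) ** 2 / q)

rng = np.random.default_rng(2)
k = 10

# Two distributions on a k-element alphabet.
p = rng.dirichlet(np.ones(k))
q = rng.dirichlet(np.ones(k))

# A random channel W, with W[y, x] = Pr(output y | input x);
# each column is a conditional distribution over outputs.
W = rng.dirichlet(np.ones(k), size=k).T

# Post-channel (observed) distributions, and the contraction check.
print(chi_square(W @ p, W @ q) <= chi_square(p, q))  # → True
```

Lower bounds follow by quantifying *how much* specific constraint channels (e.g. communication or privacy channels) must contract this distance.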