Optimal Testing for Properties of Distributions
TLDR: We provide a general approach via which we obtain sample-optimal and computationally efficient testers for all these distribution families.
Citations: 123 · Influential citations: 18
A Unified Maximum Likelihood Approach for Estimating Symmetric Properties of Discrete Distributions
TLDR: We show that a single, simple plug-in estimator, profile maximum likelihood (PML), is sample competitive for all symmetric properties, and in particular is asymptotically sample-optimal for all the above properties.
Citations: 51 · Influential citations: 13
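For reference, the PML estimator named above maximizes the likelihood of the sample's profile, i.e., the multiset of symbol multiplicities, rather than of the sample itself (standard definition, not quoted from the paper):

```latex
p_{\mathrm{PML}} \in \arg\max_{p} \; \mathbb{P}_{p}\bigl(\varphi(X^n)\bigr),
\qquad \varphi(X^n) = \text{the multiset of multiplicities of the symbols in } X^n .
```

A symmetric property is then estimated by plugging in: evaluate the property on $p_{\mathrm{PML}}$.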
Multilevel thresholding for image segmentation through a fast statistical recursive algorithm
TLDR: A novel algorithm is proposed for segmenting an image into multiple levels using its mean and variance.
Citations: 228 · Influential citations: 12
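The recursion can be sketched as follows. This is an illustrative reconstruction from the one-line summary only, not the paper's algorithm: the depth-based stopping rule and the `segment` helper are my own simplifications (the paper also uses the variance, which this toy version omits).

```python
import numpy as np

def recursive_mean_thresholds(values, depth):
    """Recursively split a set of intensity values at its mean.

    Returns a sorted list of threshold values. Illustrative only: a
    real implementation would also use the variance to decide when to
    stop splitting, as the paper's title suggests.
    """
    if depth == 0 or values.size < 2:
        return []
    t = float(values.mean())
    lower = values[values < t]
    upper = values[values >= t]
    if lower.size == 0 or upper.size == 0:
        return []
    return sorted(
        recursive_mean_thresholds(lower, depth - 1)
        + [t]
        + recursive_mean_thresholds(upper, depth - 1)
    )

def segment(image, thresholds):
    """Map each pixel to the index of the level it falls into."""
    return np.digitize(image, thresholds)
```

With `depth` splits the image is quantized into up to `2**depth` levels, e.g. `segment(img, recursive_mean_thresholds(img.ravel(), 2))`.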
Sample-Optimal Density Estimation in Nearly-Linear Time
TLDR: We design a new, fast algorithm for agnostically learning univariate probability distributions whose densities are well approximated by piecewise polynomial functions.
Citations: 72 · Influential citations: 8
Hadamard Response: Estimating Distributions Privately, Efficiently, and with Little Communication
TLDR: We study the problem of estimating $k$-ary distributions under $\varepsilon$-local differential privacy.
Citations: 52 · Influential citations: 8
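For contrast with the Hadamard mechanism studied in the paper, here is the standard $k$-ary randomized response baseline for $\varepsilon$-LDP distribution estimation. This is my own sketch with illustrative names (`k_rr`, `debias`); it is not the paper's mechanism, whose point is to achieve the same privacy with little communication per user.

```python
import numpy as np

def k_rr(samples, k, eps, rng):
    """k-ary randomized response: report the true symbol with
    probability e^eps / (e^eps + k - 1), otherwise a uniformly random
    *other* symbol. Satisfies eps-local differential privacy."""
    p_true = np.exp(eps) / (np.exp(eps) + k - 1)
    out = samples.copy()
    flip = rng.random(samples.size) >= p_true
    # Draw uniformly over the k-1 symbols other than the true one.
    noise = rng.integers(0, k - 1, size=int(flip.sum()))
    noise[noise >= samples[flip]] += 1  # skip over the true symbol
    out[flip] = noise
    return out

def debias(reports, k, eps):
    """Invert the randomization: an unbiased estimate of the true
    distribution from the noisy reports."""
    p_true = np.exp(eps) / (np.exp(eps) + k - 1)
    p_other = 1.0 / (np.exp(eps) + k - 1)
    freq = np.bincount(reports, minlength=k) / reports.size
    return (freq - p_other) / (p_true - p_other)
```

Each user here sends a full symbol (about log k bits); reducing that communication while keeping near-optimal accuracy is what the Hadamard response construction addresses.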
Testing Poisson Binomial Distributions
TLDR: We provide a sample near-optimal algorithm that, given sample access to a distribution $P$ supported on $\{0,...,n\}$, tests whether $P$ is a Poisson binomial distribution or far from every Poisson binomial distribution.
Citations: 27 · Influential citations: 7
INSPECTRE: Privately Estimating the Unseen
TLDR: We develop differentially private methods for estimating various distributional properties, including support size, support coverage, and entropy.
Citations: 22 · Influential citations: 5
Competitive Classification and Closeness Testing
We study the problems of classification and closeness testing. A classifier associates a test sequence with whichever of two training sequences was generated by the same distribution. A closeness tester determines whether two sequences were generated by the same distribution or by different ones.
Citations: 45 · Influential citations: 4
Improved bounds for universal one-bit compressive sensing
TLDR: We show how to recover the support of sparse high-dimensional vectors in the 1-bit compressive sensing framework with an asymptotically near-optimal number of measurements.
Citations: 11 · Influential citations: 4
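A minimal sketch of the setting: only the signs of random linear measurements are observed, and the support is estimated by correlation. This is the classic correlation heuristic under a Gaussian measurement assumption, not the paper's estimator or its universal guarantee.

```python
import numpy as np

def support_from_one_bit(A, y, k):
    """Estimate the support of a k-sparse signal x from one-bit
    measurements y = sign(A @ x): correlate the signs with each
    column of A and keep the k largest scores in magnitude."""
    scores = np.abs(A.T @ y)
    return set(np.argsort(scores)[-k:].tolist())
```

With Gaussian `A`, the expectation of `A.T @ np.sign(A @ x)` is proportional to `x`, so the large correlation entries point to the support; the quoted result concerns doing this universally with a near-optimal number of measurements.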
Inference Under Information Constraints I: Lower Bounds From Chi-Square Contraction
TLDR: We derive lower bounds on the sample complexity of learning and testing discrete distributions in this information-constrained setting by studying how the information constraints contract the chi-square distance between the distributions of the observed samples.
Citations: 48 · Influential citations: 3
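As a reminder of the notation (standard definitions, not quoted from the paper): the chi-square distance between discrete distributions $p$ and $q$ is

```latex
\chi^2(p \,\|\, q) \;=\; \sum_{i} \frac{(p_i - q_i)^2}{q_i},
```

and since $\chi^2$ is an $f$-divergence, the data-processing inequality says that passing samples through any channel $W$ (e.g., a local privacy or communication constraint) can only shrink it: $\chi^2(pW \,\|\, qW) \le \chi^2(p \,\|\, q)$. Quantifying how much it must shrink is what drives the lower bounds.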