Publications
Theoretically Principled Trade-off between Robustness and Accuracy
TLDR
We identify a trade-off between robustness and accuracy that serves as a guiding principle in the design of defenses against adversarial examples, and provide a differentiable upper bound using the theory of classification-calibrated loss.
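The robustness term in such a defense can be illustrated with a minimal sketch of a surrogate objective of this shape: natural cross-entropy loss plus a weighted worst-case KL divergence between predictions on a clean input and on any perturbed input within a given radius. This is an illustrative toy for a 1-D logistic model with a grid-search inner maximization, not the paper's implementation; the function names (`trades_style_loss`, `bernoulli_kl`) are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bernoulli_kl(p, q):
    # KL divergence between Bernoulli(p) and Bernoulli(q), clipped for stability
    eps = 1e-12
    p, q = np.clip(p, eps, 1 - eps), np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def trades_style_loss(w, x, y, radius, beta, n_grid=101):
    """Natural cross-entropy on (x, y) plus beta times the worst-case KL
    between the prediction on x and on any x' with |x' - x| <= radius.
    Inner maximization is done by grid search (toy 1-D setting only)."""
    p = sigmoid(w * x)
    natural = -(y * np.log(p) + (1 - y) * np.log(1 - p))
    deltas = np.linspace(-radius, radius, n_grid)
    robust = max(bernoulli_kl(p, sigmoid(w * (x + d))) for d in deltas)
    return natural + beta * robust
```

Setting `beta = 0` recovers the plain (accuracy-oriented) loss; increasing `beta` trades accuracy for local prediction stability, which is the tension the bound quantifies.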
Minimax Estimation of Functionals of Discrete Distributions
TLDR
We propose a general methodology for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite-dimensional parameters, and elaborate on the case of discrete distributions, where the support size S is comparable with or even much larger than the number of observations n.
Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance
TLDR
We present Local Moment Matching (LMM), a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance.
Universal Estimation of Directed Information
TLDR
We propose four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes, based on universal probability assignments.
Minimax estimation of the L1 distance
TLDR
We consider the problem of estimating the L1 distance between two discrete probability measures P and Q from empirical data in a nonasymptotic and large-alphabet setting.
Approximate Profile Maximum Likelihood
TLDR
We propose an efficient algorithm for approximate computation of the profile maximum likelihood, a variant of maximum likelihood maximizing the probability of observing a sufficient statistic rather than the empirical sample.
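The distinction in the TLDR above — maximizing the probability of the profile (the multiset of symbol multiplicities) rather than of the labeled sample — can be made concrete with a tiny brute-force sketch over binary distributions. This is an illustration of the definition, not the paper's efficient algorithm; the function names are my own.

```python
import itertools
from collections import Counter
import numpy as np

def profile(sample):
    """The profile: the sorted multiset of symbol multiplicities,
    which discards symbol labels (a sufficient statistic for
    symmetric functionals)."""
    return tuple(sorted(Counter(sample).values()))

def profile_prob(p, target, n):
    """Probability that an i.i.d. sample of size n from p has the
    given profile, by exhaustive enumeration (tiny n only)."""
    total = 0.0
    for s in itertools.product(range(len(p)), repeat=n):
        if profile(s) == target:
            total += float(np.prod([p[i] for i in s]))
    return total

def brute_force_pml(target, n, grid=51):
    """Grid-search PML over two-symbol distributions (illustrative)."""
    best_p, best_v = None, -1.0
    for a in np.linspace(0.01, 0.99, grid):
        v = profile_prob((a, 1.0 - a), target, n)
        if v > best_v:
            best_v, best_p = v, (a, 1.0 - a)
    return best_p
```

For example, two distinct symbols in a sample of size 2 have profile `(1, 1)`, whose probability 2p(1-p) is maximized by the uniform distribution, so the PML estimate here is (0.5, 0.5).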
Concentration Inequalities for the Empirical Distribution
TLDR
We study concentration inequalities for the Kullback--Leibler (KL) divergence between the empirical distribution and the true distribution, and contrast concentration around the expectation with concentration around zero.
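The random quantity whose concentration is studied above can be computed directly: draw n i.i.d. samples from a distribution p over a finite alphabet and evaluate the KL divergence from the empirical distribution to p. A minimal sketch (the helper name `kl_empirical` is mine):

```python
import numpy as np

def kl_empirical(samples, p):
    """KL divergence D(p_hat || p) between the empirical distribution
    of integer samples over the alphabet {0, ..., len(p)-1} and the
    true distribution p."""
    p = np.asarray(p, dtype=float)
    p_hat = np.bincount(samples, minlength=len(p)) / len(samples)
    mask = p_hat > 0  # terms with p_hat = 0 contribute zero
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))
```

Since D(p_hat || p) is zero exactly when the empirical distribution matches p, averaging this statistic over many draws is a direct way to probe the concentration behavior the paper bounds.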
Maximum Likelihood Estimation of Functionals of Discrete Distributions
TLDR
We consider the problem of estimating functionals of discrete distributions, and focus on a tight (up to universal multiplicative constants for each specific functional) nonasymptotic analysis of the worst-case squared error risk of widely used estimators.
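The "widely used estimators" analyzed in this line of work include the maximum-likelihood (plug-in) estimator: apply the functional to the empirical distribution. A minimal sketch for Shannon entropy (the function name `plugin_entropy` is mine):

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (MLE) estimate of Shannon entropy in nats: evaluate the
    entropy functional at the empirical distribution of the samples."""
    n = len(samples)
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p_hat = counts / n
    return float(-np.sum(p_hat * np.log(p_hat)))
```

By Jensen's inequality this plug-in estimator underestimates entropy in expectation, and its bias dominates the risk on large alphabets — the regime where the nonasymptotic analysis above is informative.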
Optimal rates of entropy estimation over Lipschitz balls
TLDR
We consider the problem of minimax estimation of the entropy of a density over Lipschitz balls.