We identify a trade-off between robustness and accuracy that serves as a guiding principle in the design of defenses against adversarial examples, and provide a differentiable upper bound using the theory of classification-calibrated loss.

We propose a general methodology for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite dimensional parameters, and elaborate on the case of discrete distributions, where the support size S is comparable with or even much larger than the number of observations n.

We present \emph{Local Moment Matching (LMM)}, a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance.

Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments.

We consider the problem of estimating the $L_{1}$ distance between two discrete probability measures $P$ and $Q$ from empirical data in a nonasymptotic and large alphabet setting.
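As a point of reference for this estimation problem, a minimal sketch of the naive plug-in estimator (not the paper's minimax-optimal construction): evaluate $\sum_i |P(i) - Q(i)|$ at the empirical distributions. The function name and interface here are illustrative assumptions.

```python
from collections import Counter

def empirical_l1_distance(xs, ys, alphabet):
    """Plug-in estimate of the L1 distance sum_i |P(i) - Q(i)|,
    computed from samples xs ~ P and ys ~ Q over a finite alphabet.
    This naive estimator is known to be suboptimal in the large-alphabet
    regime that the abstract above targets."""
    p, q = Counter(xs), Counter(ys)
    n, m = len(xs), len(ys)
    return sum(abs(p[a] / n - q[a] / m) for a in alphabet)
```

For example, with samples `["a","a","b","b"]` and `["a","b","b","b"]` over alphabet `{a, b}`, the empirical frequencies are (0.5, 0.5) and (0.25, 0.75), giving a plug-in distance of 0.5.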

We propose an efficient algorithm for approximate computation of the profile maximum likelihood, a variant of maximum likelihood maximizing the probability of observing a sufficient statistic rather than the empirical sample.

We study concentration inequalities for the Kullback--Leibler (KL) divergence between the empirical distribution and the true distribution, and contrast concentration around the expectation with concentration around zero.
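The quantity being concentrated can be sketched as follows, assuming samples drawn from a known finite-alphabet distribution; the function name is illustrative.

```python
import math
from collections import Counter

def empirical_kl(samples, true_dist):
    """KL divergence D(P_hat || P) of the empirical distribution P_hat
    from the true distribution P, in nats. Assumes true_dist[s] > 0 for
    every observed symbol s; unobserved symbols contribute zero."""
    n = len(samples)
    counts = Counter(samples)
    return sum((c / n) * math.log((c / n) / true_dist[s])
               for s, c in counts.items())
```

When the empirical frequencies match the true distribution exactly, the divergence is zero; any deviation makes it strictly positive, which is why its concentration around zero (rather than only around its expectation) is a meaningful question.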

We consider the problem of estimating functionals of discrete distributions, and focus on a tight (up to universal multiplicative constants for each specific functional) nonasymptotic analysis of the worst case squared error risk of widely used estimators.
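A canonical example of such a "widely used estimator" is the plug-in (MLE) estimator of Shannon entropy: evaluate the functional at the empirical distribution. A minimal sketch, with an illustrative function name:

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (MLE) estimate of Shannon entropy H(P) in nats:
    the entropy of the empirical distribution. Its worst-case squared
    error risk is exactly the kind of quantity such nonasymptotic
    analyses characterize."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n)
                for c in Counter(samples).values())
```

On the sample `["a","a","b","b"]` this returns the entropy of the uniform distribution on two symbols, log 2 nats.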