
We introduce the notion of *restricted sensitivity* as an alternative to global and smooth sensitivity to improve accuracy in differentially private data analysis. The definition of restricted sensitivity is similar to that of global sensitivity except that instead of quantifying over all possible datasets, we take advantage of any beliefs about the…
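The global-sensitivity baseline that restricted sensitivity improves on can be illustrated with the standard Laplace mechanism. The following is a minimal sketch of that baseline, not the paper's restricted-sensitivity construction; the function names and toy data are illustrative assumptions, for a counting query whose global sensitivity is 1.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(dataset, predicate, epsilon: float) -> float:
    """Release a predicate count with epsilon-differential privacy.

    A counting query has global sensitivity 1 (adding or removing one
    record changes the answer by at most 1), so Laplace noise with
    scale 1/epsilon suffices under the global-sensitivity analysis.
    """
    true_count = sum(1 for x in dataset if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical toy data: privately count records with age >= 30.
ages = [23, 35, 41, 29, 52, 38]
noisy = private_count(ages, lambda a: a >= 30, epsilon=0.5)
```

For queries whose global sensitivity is large, this noise scale destroys accuracy; restricting attention to datasets consistent with prior beliefs is what lets the paper add less noise.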

Clustering under most popular objective functions is NP-hard, even to approximate well, and so unlikely to be efficiently solvable in the worst case. Recently, Bilu and Linial [11] suggested an approach aimed at bypassing this computational barrier by using properties of instances one might hope to hold in practice. In particular, they argue that instances…

Emek et al. presented a model of probabilistic single-item second price auctions where an auctioneer, who is informed about the type of an item for sale, broadcasts a signal about this type to uninformed bidders. They proved that finding the optimal (for the purpose of generating revenue) *pure* signaling scheme is strongly NP-hard. In contrast, we prove…

We adopt a utilitarian perspective on social choice, assuming that agents have (possibly latent) utility functions over some space of alternatives. For many reasons one might consider mechanisms, or *social choice functions*, that only have access to the ordinal rankings of alternatives by the individual agents rather than their utility functions. In…

This work concerns learning probabilistic models for ranking data in a heterogeneous population. The specific problem we study is learning the parameters of a Mallows Mixture Model. Despite being widely studied, current heuristics for this problem do not have theoretical guarantees and can get stuck in bad local optima. We present the first polynomial time…
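As background to the learning problem, a single Mallows model can be sampled with the standard repeated-insertion method. The following is a minimal sketch under that assumption; the function name and parameter names are illustrative, with `phi` in (0, 1] as the dispersion parameter.

```python
import random

def sample_mallows(center, phi, rng=random):
    """Sample one ranking from a Mallows model by repeated insertion.

    `center` is the central ranking; `phi` near 0 concentrates mass
    on `center`, while `phi` = 1 gives the uniform distribution.
    Item center[i] is inserted at position j with probability
    proportional to phi ** (i - j), which is the standard
    repeated-insertion sampler for the Mallows distribution.
    """
    ranking = []
    for i, item in enumerate(center):
        weights = [phi ** (i - j) for j in range(i + 1)]
        j = rng.choices(range(i + 1), weights=weights)[0]
        ranking.insert(j, item)
    return ranking
```

A mixture adds a latent component choice before each draw; recovering the centers and dispersions of the components from samples alone is the hard part the abstract refers to.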

Aiming to unify known results about clustering mixtures of distributions under separation conditions, Kumar and Kannan [KK10] introduced a deterministic condition for clustering datasets. They showed that this single deterministic condition encompasses many previously studied clustering assumptions. More specifically, their proximity condition requires that…

We consider k-median clustering in finite metric spaces and k-means clustering in Euclidean spaces, in the setting where k is part of the input (not a constant). For the k-means problem, Ostrovsky et al. [18] show that if the optimal (k − 1)-means clustering of the input is more expensive than the optimal k-means clustering by a factor of 1/ε², then one…

This paper proves that an "old dog", namely the classical Johnson-Lindenstrauss transform, "performs new tricks": it gives a novel way of preserving differential privacy. We show that if we take two databases, D and D′, such that (i) D − D′ is a rank-1 matrix of bounded norm and (ii) all singular values of D and D′ are sufficiently large, then…
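The transform itself is just multiplication by a scaled random Gaussian matrix. The following is a minimal sketch of that projection only; the privacy guarantee additionally requires the singular-value condition stated above, and the function name and dimensions `r`, `n` are illustrative assumptions.

```python
import random

def jl_project(matrix, r, rng=None):
    """Left-multiply an n x d matrix by a random (r x n) Gaussian
    matrix scaled by 1/sqrt(r), the Johnson-Lindenstrauss transform.
    Pairwise distances between columns are approximately preserved
    with high probability when r is large enough.
    """
    rng = rng or random.Random(0)
    n, d = len(matrix), len(matrix[0])
    scale = r ** -0.5
    proj = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(r)]
    return [[scale * sum(proj[i][k] * matrix[k][j] for k in range(n))
             for j in range(d)]
            for i in range(r)]
```

The published output is the projected matrix; because the projection is random, two neighboring databases induce nearby output distributions, which is where the privacy argument lives.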

For a graph G and an integer t we let mcc_t(G) be the smallest m such that there exists a coloring of the vertices of G by t colors with no monochromatic connected subgraph having more than m vertices. Let F be any nontrivial minor-closed family of graphs. We show that mcc_2(G) = O(n^(2/3)) for any n-vertex graph G ∈ F. This bound is asymptotically optimal…
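For a fixed coloring, the quantity being minimized in mcc_t(G) is the order of the largest monochromatic connected subgraph, which one BFS pass per component computes. A minimal sketch, assuming the graph is given as adjacency lists and the coloring as a vertex-to-color dict (both names are illustrative):

```python
from collections import deque

def largest_monochromatic_component(adj, coloring):
    """Size of the largest connected subgraph whose vertices all
    share one color, given adjacency lists `adj` and a vertex ->
    color map. Each monochromatic component is explored once by BFS,
    since only same-colored neighbors are enqueued.
    """
    seen = set()
    best = 0
    for start in adj:
        if start in seen:
            continue
        color = coloring[start]
        seen.add(start)
        queue = deque([start])
        size = 0
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w not in seen and coloring[w] == color:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best
```

mcc_t(G) is then the minimum of this value over all t-colorings; the result above bounds that minimum by O(n^(2/3)) for t = 2 on minor-closed families.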