
- Moritz Hardt, Kunal Talwar
- STOC
- 2010

We consider the noise complexity of differentially private mechanisms in the setting where the user asks d linear queries f : ℝⁿ → ℝ non-adaptively. Here, the database is represented by a vector in ℝⁿ and proximity between databases is measured in the ℓ₁-metric. We show that the noise complexity is determined by two geometric…
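For context, the standard baseline in this setting adds Laplace noise calibrated to the ℓ₁-sensitivity of the query matrix. A minimal numpy sketch of that baseline (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def laplace_linear_queries(x, F, eps, rng=None):
    """Answer the d linear queries F @ x with Laplace noise.

    x   : database vector in R^n
    F   : (d, n) matrix whose rows are the linear queries
    eps : privacy parameter epsilon

    Noise is calibrated to the l1-sensitivity of F: the largest column
    l1-norm, i.e. how much all answers together can change when one
    unit of l1 mass moves in the database.
    """
    rng = np.random.default_rng(rng)
    sensitivity = np.abs(F).sum(axis=0).max()   # max column l1-norm
    true_answers = F @ x
    noise = rng.laplace(scale=sensitivity / eps, size=F.shape[0])
    return true_answers + noise
```

The paper's contribution is showing when noise tuned to the geometry of the query set can beat this worst-case calibration.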

- Moritz Hardt, Guy N. Rothblum
- 2010 IEEE 51st Annual Symposium on Foundations of…
- 2010

We consider statistical data analysis in the interactive setting. In this setting a trusted curator maintains a database of sensitive information about individual participants, and releases privacy-preserving answers to queries as they arrive. Our primary contribution is a new differentially private multiplicative weights mechanism for answering a large…

We study *fairness in classification*, where individuals are classified, e.g., admitted to a university, and the goal is to prevent discrimination against individuals based on their membership in some group, while maintaining utility for the classifier (the university). The main conceptual contribution of this paper is a framework for fair…

- Moritz Hardt, Katrina Ligett, Frank McSherry
- NIPS
- 2012

We present a new algorithm for differentially private data release, based on a simple combination of the Exponential Mechanism with the Multiplicative Weights update rule. Our MWEM algorithm achieves the best known, and nearly optimal, theoretical guarantees, while at the same time being simple to implement and experimentally more accurate on actual…
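The combination the abstract describes can be sketched in a few lines: repeatedly select a badly-approximated query via the Exponential Mechanism, measure it with Laplace noise, and fold the answer into a synthetic histogram with a Multiplicative Weights update. A toy version for 0/1 counting queries over a histogram domain (the even budget split and fixed round count are illustrative simplifications of the paper's algorithm):

```python
import numpy as np

def mwem(true_hist, queries, eps, T, rng=None):
    """Toy MWEM for linear counting queries on a histogram.

    true_hist : nonnegative histogram over the data domain (sums to n)
    queries   : (d, k) 0/1 matrix; each row selects a subset of bins
    eps       : total privacy budget, split evenly over T rounds
    Returns a synthetic histogram approximating all query answers.
    """
    rng = np.random.default_rng(rng)
    n = true_hist.sum()
    k = true_hist.size
    synth = np.full(k, n / k)          # start from the uniform distribution
    eps_round = eps / (2 * T)
    for _ in range(T):
        # Exponential Mechanism: sample a query with prob. ~ exp(eps * error / 2).
        errors = np.abs(queries @ true_hist - queries @ synth)
        scores = eps_round * errors / 2.0
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        i = rng.choice(len(queries), p=probs)
        # Laplace mechanism: noisy answer to the selected query.
        a_hat = queries[i] @ true_hist + rng.laplace(scale=1.0 / eps_round)
        # Multiplicative Weights update toward the measured answer.
        synth *= np.exp(queries[i] * (a_hat - queries[i] @ synth) / (2.0 * n))
        synth *= n / synth.sum()       # renormalize back to total mass n
    return synth
```

With a generous privacy budget the synthetic histogram's query answers converge toward the true ones; the paper's analysis quantifies the trade-off for realistic budgets.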

- Moritz Hardt, Benjamin Recht, Yoram Singer
- ICML
- 2016

We show that parametric models trained by a stochastic gradient method (SGM) with few iterations have vanishing generalization error. We prove our results by arguing that SGM is algorithmically stable in the sense of Bousquet and Elisseeff. Our analysis only employs elementary tools from convex and continuous optimization. We derive stability bounds for…
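The object of the analysis is just the plain stochastic gradient update w ← w − η·∇ℓ(w; xᵢ, yᵢ) over shuffled examples. A minimal least-squares instance (dataset, step size, and function name are illustrative, not from the paper):

```python
import numpy as np

def sgm(X, y, lr, epochs, seed=0):
    """Plain stochastic gradient method for least-squares regression.

    One pass over a fresh random permutation per epoch; the loss per
    example is 0.5 * (x @ w - y)^2, whose gradient is (x @ w - y) * x.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            grad = (X[i] @ w - y[i]) * X[i]   # per-example gradient
            w -= lr * grad
    return w
```

The stability argument in the paper bounds how much this trajectory can diverge when a single training example is replaced, which in turn bounds the generalization error.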

- Moritz Hardt, Eric Price, Nathan Srebro
- NIPS
- 2016

We propose a criterion for discrimination against a specified sensitive attribute in supervised learning, where the goal is to predict some target based on available features. Assuming data about the predictor, target, and membership in the protected group are available, we show how to optimally adjust any learned predictor so as to remove discrimination…
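The criterion asks that predictions be independent of the protected attribute conditional on the true outcome, i.e. equal true-positive and false-positive rates across groups. A small helper that measures how far a binary predictor is from satisfying it (the interface is illustrative):

```python
import numpy as np

def equalized_odds_gap(y_true, y_pred, group):
    """Max gap in true/false positive rates between two groups (0 and 1).

    Conditions on each true label in turn and compares the rate at which
    each group receives a positive prediction; returns the larger gap.
    A perfectly non-discriminatory predictor (in this sense) scores 0.
    """
    gaps = []
    for label in (0, 1):                  # condition on the true outcome
        rates = []
        for g in (0, 1):
            sel = (y_true == label) & (group == g)
            rates.append(y_pred[sel].mean())
        gaps.append(abs(rates[0] - rates[1]))
    return max(gaps)
```

The paper goes further: it shows how to post-process any predictor to drive this gap to zero while losing as little accuracy as possible.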

- Moritz Hardt, Eric Price
- NIPS
- 2014

We provide a new robust convergence analysis of the well-known power method for computing the dominant singular vectors of a matrix, in a variant we call the noisy power method. Our result characterizes the convergence behavior of the algorithm when a significant amount of noise is introduced after each matrix-vector multiplication. The noisy power method can be seen…
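The algorithm itself is easy to state: ordinary power iteration with a noise term injected after every multiplication. A sketch for the top singular vector under Gaussian noise (the paper's noise model is more general, and it also handles multiple vectors at once):

```python
import numpy as np

def noisy_power_method(A, iters, noise_scale, rng=None):
    """Power iteration with noise added after each matrix product.

    Approximates the top right singular vector of A by iterating with
    A^T A; the noise term models, e.g., privacy-preserving perturbation
    or inexact arithmetic. Accuracy degrades gracefully in noise_scale.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[1]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = A.T @ (A @ x)                          # one exact power step
        y += noise_scale * rng.standard_normal(n)  # noise after the product
        x = y / np.linalg.norm(y)
    return x
```

The analysis shows convergence is governed by the gap between the top singular values, provided the per-step noise stays below that gap.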

- Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, Oriol Vinyals
- ArXiv
- 2016

Despite their massive size, successful deep artificial neural networks can exhibit a remarkably small difference between training and test performance. Conventional wisdom attributes small generalization error either to properties of the model family, or to the regularization techniques used during training. Through extensive systematic experiments, we show…

- Moritz Hardt
- 2014 IEEE 55th Annual Symposium on Foundations of…
- 2014

Alternating minimization is a widely used and empirically successful heuristic for matrix completion and related low-rank optimization problems. Theoretical guarantees for alternating minimization have been hard to come by and are still poorly understood. This is in part because the heuristic is iterative and non-convex in nature. We give a new algorithm…
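The heuristic in question alternates two least-squares solves: fix one factor of the low-rank model M ≈ U Vᵀ and refit the other on the observed entries only. A compact numpy sketch (random initialization and a fixed iteration count are simplifications; the paper's algorithm adds modifications needed for its guarantees):

```python
import numpy as np

def alternating_minimization(M, mask, rank, iters, rng=None):
    """Alternating least squares for matrix completion.

    M    : (m, n) matrix; only entries where mask is True are used
    mask : boolean matrix marking the observed entries
    rank : target rank r; fits M ~ U @ V.T with U (m, r) and V (n, r)

    Each half-step fixes one factor and solves an independent
    least-squares problem per row, restricted to observed entries.
    """
    rng = np.random.default_rng(rng)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(iters):
        for i in range(m):                 # refit row i of U, V fixed
            obs = mask[i]
            if obs.any():
                U[i] = np.linalg.lstsq(V[obs], M[i, obs], rcond=None)[0]
        for j in range(n):                 # refit row j of V, U fixed
            obs = mask[:, j]
            if obs.any():
                V[j] = np.linalg.lstsq(U[obs], M[obs, j], rcond=None)[0]
    return U, V
```

Each half-step is convex and only decreases the observed-entry loss, but the joint problem is non-convex, which is why global guarantees for this loop were elusive.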

A great deal of effort has been devoted to reducing the risk of spurious scientific discoveries, from the use of sophisticated validation techniques, to deep statistical methods for controlling the false discovery rate in multiple hypothesis testing. However, there is a fundamental disconnect between the theoretical results and the practice of data…