An Algorithmic Framework For Differentially Private Data Analysis on Trusted Processors

@article{Allen2018AnAF,
  title={An Algorithmic Framework For Differentially Private Data Analysis on Trusted Processors},
  author={Joshua Allen and Bolin Ding and Janardhan Kulkarni and Harsha Nori and Olga Ohrimenko and Sergey Yekhanin},
  journal={CoRR},
  year={2018},
  volume={abs/1807.00736}
}
Differential privacy has emerged as the main definition for private data analysis and machine learning. The global model of differential privacy, which assumes that users trust the data collector, provides strong privacy guarantees and introduces small errors in the output. In contrast, applications of differential privacy in commercial systems by Apple, Google, and Microsoft use the local model. Here, users do not trust the data collector, and hence randomize their data before sending it to…
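
As a rough illustration of the distinction drawn in the abstract (not code from the paper), the minimal Python sketch below contrasts the two models on a single counting query: in the global model a trusted collector adds Laplace noise to the exact count, while in the local model each user applies randomized response before reporting. The function names, the choice of randomized response, and the parameter values are illustrative assumptions, not the paper's mechanisms.

import numpy as np

def global_count(data, epsilon):
    # Global (trusted-curator) model: the collector sees the raw bits and
    # releases the exact count plus Laplace noise of scale 1/epsilon
    # (the sensitivity of a counting query is 1).
    true_count = sum(data)
    return true_count + np.random.laplace(scale=1.0 / epsilon)

def local_count(data, epsilon):
    # Local model: each user randomizes their own bit via randomized
    # response before sending it; the collector only sees the noisy
    # reports and debiases their sum.
    p = np.exp(epsilon) / (np.exp(epsilon) + 1.0)  # prob. of reporting truthfully
    reports = [b if np.random.random() < p else 1 - b for b in data]
    # Unbiased estimator: E[sum(reports)] = C*(2p - 1) + n*(1 - p).
    return (sum(reports) - len(data) * (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    data = (np.random.random(100_000) < 0.3).astype(int).tolist()
    eps = 1.0
    print("true count:           ", sum(data))
    print("global-model estimate:", round(global_count(data, eps)))
    print("local-model estimate: ", round(local_count(data, eps)))

Running the sketch with the same epsilon shows the familiar trade-off the abstract alludes to: the global-model estimate has constant expected error (roughly 1/epsilon), while the local-model error grows on the order of the square root of the number of users.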