Corpus ID: 224814403

DuetSGX: Differential Privacy with Secure Hardware

@article{Nguyen2020DuetSGXDP,
  title={DuetSGX: Differential Privacy with Secure Hardware},
  author={Phillip Nguyen and Alex Silence and David Darais and Joseph P. Near},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.10664}
}
Differential privacy offers a formal privacy guarantee for individuals, but many deployments of differentially private systems require a trusted third party (the data curator). We propose DuetSGX, a system that uses secure hardware (Intel's SGX) to eliminate the need for a trusted data curator. Data owners submit encrypted data that can be decrypted only within a secure enclave running the DuetSGX system, ensuring that sensitive data is never available to the data curator. Analysts submit…
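
The architecture described above hinges on the data-owner side: each record is encrypted under a key whose private half is meant to live only inside the SGX enclave, so the curator only ever handles ciphertext. The Python sketch below illustrates that client-side flow under assumed details; the hybrid AES-GCM/RSA-OAEP scheme, the function name encrypt_record, and the example record are illustrative assumptions, not the actual DuetSGX API.

    # Hypothetical sketch of the data-owner side of a DuetSGX-style workflow.
    # The record is encrypted so that only the holder of the private key
    # (which, in the real system, would never leave the SGX enclave) can decrypt it.
    import json, os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Stand-in for the key pair whose private half would live only inside the enclave.
    enclave_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    enclave_public_key = enclave_private_key.public_key()

    def encrypt_record(record: dict, enclave_pub) -> dict:
        """Hybrid encryption: AES-GCM for the record, RSA-OAEP for the data key."""
        data_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(data_key).encrypt(nonce, json.dumps(record).encode(), None)
        wrapped_key = enclave_pub.encrypt(
            data_key,
            padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )
        return {"nonce": nonce, "ciphertext": ciphertext, "wrapped_key": wrapped_key}

    # A data owner submits an encrypted record; the untrusted curator only relays it.
    submission = encrypt_record({"age": 34, "diagnosis": "flu"}, enclave_public_key)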
1 Citation

Expressive Authorization Policies using Computation Principals
In authorization logics, it is natural to treat computations as principals, since systems need to decide how much authority to give computations when they execute. But unlike other kinds of…

References

Showing 1-10 of 32 references
Cryptϵ: Crypto-Assisted Differential Privacy on Untrusted Servers
This work proposes Cryptϵ, a system and programming framework that achieves the accuracy guarantees and algorithmic expressibility of the central model without any trusted data collector, as in the local model, and demonstrates Cryptϵ's practical feasibility with extensive empirical evaluations on real-world datasets.
Ryoan: A Distributed Sandbox for Untrusted Computation on Secret Data
Ryoan provides a distributed sandbox, leveraging hardware enclaves to protect sandbox instances from potentially malicious computing platforms, and is designed for a request-oriented data model, where confined modules only process input once and do not persist state about the input.
Honeycrisp: large-scale differentially private aggregation without a trusted core
A system called Honeycrisp is described, whose privacy cost depends on how often the data changes, not on how often a query is asked, and which can answer periodic queries for many years, as long as the underlying data does not change too often.
Opaque: An Oblivious and Encrypted Distributed Analytics Platform
The proposed Opaque introduces new distributed oblivious relational operators that hide access patterns, along with new query-planning techniques that optimize these operators to improve performance.
ShrinkWrap: Efficient SQL Query Processing in Differentially Private Data Federations
ShrinkWrap is introduced: a private data federation that offers data owners a differentially private view of the data held by others in order to improve performance over oblivious query processing, providing a trade-off between result accuracy and query evaluation performance.
Locally Differentially Private Protocols for Frequency Estimation
This paper introduces a framework that generalizes several LDP protocols proposed in the literature and yields a simple and fast aggregation algorithm, whose accuracy can be precisely analyzed, resulting in two new protocols that provide better utility than protocols previously proposed.
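
For context on the kind of protocol this framework generalizes, here is a minimal Python sketch of binary randomized response, a standard local-DP frequency-estimation primitive; it is illustrative background under standard assumptions, not the optimized protocols proposed in the paper.

    # Binary randomized response: each user reports the true bit with
    # probability e^eps / (e^eps + 1), otherwise the flipped bit; the
    # aggregator corrects for the known perturbation probability.
    import math, random

    def perturb(bit: int, epsilon: float) -> int:
        """Locally perturb a single user's bit with epsilon-LDP."""
        p = math.exp(epsilon) / (math.exp(epsilon) + 1)
        return bit if random.random() < p else 1 - bit

    def estimate_frequency(reports: list, epsilon: float) -> float:
        """Unbiased estimate of the fraction of users whose true bit is 1."""
        p = math.exp(epsilon) / (math.exp(epsilon) + 1)
        observed = sum(reports) / len(reports)
        return (observed - (1 - p)) / (2 * p - 1)

    true_bits = [1] * 300 + [0] * 700
    reports = [perturb(b, epsilon=1.0) for b in true_bits]
    print(estimate_frequency(reports, epsilon=1.0))  # roughly 0.3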
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
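
For reference, the central definition the monograph builds on is standard (epsilon, delta)-differential privacy (stated here for context, not a claim about this summary's wording): a randomized mechanism $\mathcal{M}$ satisfies it if, for all neighboring datasets $D, D'$ and every set of outcomes $S$,

    $\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S] + \delta$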
VC3: Trustworthy Data Analytics in the Cloud Using SGX
We present VC3, the first system that allows users to run distributed MapReduce computations in the cloud while keeping their code and data secret, and ensuring the correctness and completeness of…
Ironclad Apps: End-to-End Security via Automated Full-System Verification
This work provides complete, low-level software verification of a full software stack, which includes a verified kernel; verified drivers; verified system and crypto libraries including SHA, HMAC, and RSA; and four Ironclad Apps.
Calibrating Noise to Sensitivity in Private Data Analysis
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
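
The calibration summarized above is realized by the Laplace mechanism. Below is a minimal Python sketch under standard assumptions (a real-valued query with known global sensitivity); the function name and the example query are illustrative.

    # Minimal Laplace mechanism: add noise with scale sensitivity/epsilon to a
    # real-valued query answer.
    import random

    def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
        """Release true_answer with Laplace(sensitivity / epsilon) noise, giving epsilon-DP."""
        scale = sensitivity / epsilon
        # The difference of two independent Exp(1/scale) draws is Laplace with that scale.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_answer + noise

    # Example: a counting query has global sensitivity 1, since adding or
    # removing one person changes the count by at most 1.
    ages = [23, 35, 41, 52, 29]
    noisy_count = laplace_mechanism(len(ages), sensitivity=1.0, epsilon=0.5)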