Corpus ID: 224814403

DuetSGX: Differential Privacy with Secure Hardware

Phillip Q. Nguyen, Alex Silence, David Darais, Joseph P. Near
Differential privacy offers a formal privacy guarantee for individuals, but many deployments of differentially private systems require a trusted third party (the data curator). We propose DuetSGX, a system that uses secure hardware (Intel's SGX) to eliminate the need for a trusted data curator. Data owners submit encrypted data that can be decrypted only within a secure enclave running the DuetSGX system, ensuring that sensitive data is never available to the data curator. Analysts submit… 
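The trust boundary described above can be sketched in a few lines of Python. This is a minimal illustration only, not the DuetSGX implementation: the `Enclave` class, the XOR "cipher" (a toy stand-in for real authenticated encryption), and all names are hypothetical, and a real deployment would use SGX remote attestation and enclave-sealed keys rather than handing out the key directly.

```python
import math
import random
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR 'encryption', standing in for real authenticated encryption."""
    return bytes(k ^ d for k, d in zip(key, data))

def laplace(scale: float, rng: random.Random) -> float:
    """Draw one sample from the Laplace(0, scale) distribution."""
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

class Enclave:
    """Models the secure enclave: the key never leaves this object, and only
    noisy answers ever cross the trust boundary to the curator or analyst."""
    def __init__(self):
        self._key = secrets.token_bytes(8)
        self._rng = random.Random(0)  # fixed seed for reproducibility

    @property
    def encryption_key(self):
        # In real SGX, data owners would derive this key only after
        # verifying the enclave via remote attestation.
        return self._key

    def private_count(self, ciphertexts, threshold, epsilon):
        values = [int.from_bytes(xor_cipher(self._key, c), "big")
                  for c in ciphertexts]
        true_count = sum(1 for v in values if v >= threshold)
        # A counting query has sensitivity 1, so the noise scale is 1/epsilon.
        return true_count + laplace(1.0 / epsilon, self._rng)

# Data owners encrypt their ages; the untrusted curator only relays ciphertexts.
enclave = Enclave()
ciphertexts = [xor_cipher(enclave.encryption_key, age.to_bytes(8, "big"))
               for age in [34, 51, 29, 62, 45]]
answer = enclave.private_count(ciphertexts, threshold=40, epsilon=1.0)
print(round(answer, 2))  # noisy count of ages >= 40 (true count is 3)
```

The point of the sketch is the boundary: plaintext values and the decryption key exist only inside `Enclave`, so the curator sees ciphertexts and the analyst sees only the differentially private answer.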
1 Citation

Figures from this paper

Expressive Authorization Policies using Computation Principals
In authorization logics, it is natural to treat computations as principals, since systems need to decide how much authority to give computations when they execute. But unlike other kinds of


References

Cryptϵ: Crypto-Assisted Differential Privacy on Untrusted Servers
This work proposes Cryptϵ, a system and programming framework that achieves the accuracy guarantees and algorithmic expressibility of the central model without any trusted data collector, as in the local model, and demonstrates Cryptϵ's practical feasibility with extensive empirical evaluations on real-world datasets.
Ryoan: A Distributed Sandbox for Untrusted Computation on Secret Data
Ryoan provides a distributed sandbox, leveraging hardware enclaves to protect sandbox instances from potentially malicious computing platforms and is designed for a request-oriented data model, where confined modules only process input once and do not persist state about the input.
Honeycrisp: large-scale differentially private aggregation without a trusted core
A system called Honeycrisp is described whose privacy cost depends on how often the data changes, not on how often a query is asked; it can therefore answer periodic queries for many years, as long as the underlying data does not change too often.
Opaque: An Oblivious and Encrypted Distributed Analytics Platform
Opaque introduces new distributed oblivious relational operators that hide access patterns, along with new query-planning techniques that optimize these operators to improve performance.
ShrinkWrap: Efficient SQL Query Processing in Differentially Private Data Federations
ShrinkWrap is introduced, a private data federation that offers data owners a differentially private view of the data held by others, improving performance over oblivious query processing and providing a trade-off between result accuracy and query evaluation performance.
Locally Differentially Private Protocols for Frequency Estimation
This paper introduces a framework that generalizes several LDP protocols proposed in the literature and yields a simple and fast aggregation algorithm, whose accuracy can be precisely analyzed, resulting in two new protocols that provide better utility than protocols previously proposed.
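One of the simplest protocols in this family, generalized randomized response, can be sketched as follows. This is a hedged illustration of the general technique, not the specific framework from the paper above: each user reports their true value with probability p and a uniformly random other value otherwise, and the aggregator debiases the raw counts into unbiased frequency estimates.

```python
import math
import random

def grr_perturb(value, domain, epsilon, rng):
    """Local randomizer: report the true value with probability p,
    otherwise a uniformly random *other* value from the domain."""
    d = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + d - 1)
    if rng.random() < p:
        return value
    return rng.choice([v for v in domain if v != value])

def grr_estimate(reports, domain, epsilon):
    """Aggregator: debias raw counts into unbiased frequency estimates."""
    d = len(domain)
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + d - 1)
    q = 1.0 / (math.exp(epsilon) + d - 1)
    return {v: (sum(1 for r in reports if r == v) / n - q) / (p - q)
            for v in domain}

rng = random.Random(1)
domain = ["A", "B", "C"]
true_data = ["A"] * 600 + ["B"] * 300 + ["C"] * 100
reports = [grr_perturb(v, domain, 2.0, rng) for v in true_data]
est = grr_estimate(reports, domain, 2.0)
print(est)  # estimates near the true frequencies 0.6, 0.3, 0.1
```

Because each report is randomized on the user's device, the aggregator never sees a true value; accuracy improves with the number of users, which is the trade-off the protocols in the paper optimize.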
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
VC3: Trustworthy Data Analytics in the Cloud Using SGX
We present VC3, the first system that allows users to run distributed MapReduce computations in the cloud while keeping their code and data secret, and ensuring the correctness and completeness of their results.
Ironclad Apps: End-to-End Security via Automated Full-System Verification
This work provides complete, low-level software verification of a full stack of verified software, which includes a verified kernel; verified drivers; verified system and crypto libraries including SHA, HMAC, and RSA; and four Ironclad Apps.
Calibrating Noise to Sensitivity in Private Data Analysis
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of f, the amount by which a single argument to f can change its output.
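The calibration described above is the classic Laplace mechanism: release f(x) + Lap(Δf/ε), where Δf is the sensitivity. A minimal sketch (the function names are illustrative, not from the paper):

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Release true_answer + Lap(sensitivity / epsilon), the noise
    calibration that yields epsilon-differential privacy."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5
    noise = -scale * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_answer + noise

# A counting query: "how many records satisfy a predicate?" Adding or
# removing one record changes the count by at most 1, so sensitivity = 1.
rng = random.Random(42)
true_count = 128
noisy = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5, rng=rng)
print(round(noisy, 2))  # the true count plus Laplace noise of scale 2
```

A smaller ε means a larger noise scale and stronger privacy; a query with higher sensitivity (e.g. a sum over unbounded values) needs proportionally more noise.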