Publications
Advances and Open Problems in Federated Learning
TLDR: Federated learning (FL) is a machine learning setting where many clients (e.g., mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server, while keeping the training data decentralized.
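The orchestration pattern described in this TLDR is commonly instantiated as federated averaging (FedAvg): clients run local updates on their own data and the server averages the resulting models. A minimal plaintext sketch, assuming a toy linear-regression model and synthetic client data (all names here are illustrative, not from the paper):

```python
import numpy as np

def client_update(weights, X, y, lr=0.1, epochs=5):
    # Local full-batch gradient descent on one client's private data
    # (linear regression with squared loss).
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(weights, clients):
    # Server averages client updates, weighted by local dataset size;
    # raw data never leaves the clients.
    updates = [(client_update(weights, X, y), len(y)) for X, y in clients]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))  # noiseless labels for the sketch

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
```

Because each client's data is consistent with the same `true_w`, the averaged model converges to it; real FL must additionally handle non-IID data, partial participation, and privacy mechanisms, which are exactly the open problems the paper surveys.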
The Privacy Blanket of the Shuffle Model
TLDR: We provide an optimal single-message protocol for summation of real numbers in the shuffle model.
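As a toy illustration of the shuffle model (deliberately not the paper's optimal real-summation protocol): each client applies a local randomizer, a trusted shuffler permutes the reports to break the link to senders, and the analyst debiases the aggregate. Here the randomizer is simple binary randomized response:

```python
import random

def randomize_bit(b, p=0.25):
    # Local randomizer: flip the client's bit with probability p.
    return b ^ 1 if random.random() < p else b

def shuffle_and_sum(bits, p=0.25):
    reports = [randomize_bit(b, p) for b in bits]
    random.shuffle(reports)  # the shuffler hides which client sent what
    # Debias: E[report] = b*(1-p) + (1-b)*p, so invert the affine map.
    n = len(reports)
    return (sum(reports) - n * p) / (1 - 2 * p)

random.seed(1)
bits = [1] * 600 + [0] * 400
est = shuffle_and_sum(bits)  # unbiased estimate of the true sum, 600
```

The shuffling step is what lets a weak local randomizer achieve strong central-model-style privacy, which is the amplification phenomenon the "privacy blanket" analysis quantifies.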
Privacy-Preserving Distributed Linear Regression on High-Dimensional Data
TLDR: We propose privacy-preserving protocols for computing linear regression models, in the setting where the training dataset is vertically distributed among several parties.
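In the vertically partitioned setting this TLDR refers to, each party holds a disjoint block of feature columns for the same records. The Gram matrix X^T X needed by the normal equations then decomposes into per-party diagonal blocks and cross-party blocks; the cross blocks are the part a secure protocol must compute without revealing the data. A plaintext sketch of that decomposition (party names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
XA = rng.normal(size=(n, 3))  # party A's feature columns
XB = rng.normal(size=(n, 2))  # party B's feature columns
y = rng.normal(size=n)

X = np.hstack([XA, XB])
# X^T X splits into blocks: A and B can compute the diagonal blocks
# locally; only the cross blocks XA^T XB need joint computation.
gram = np.block([[XA.T @ XA, XA.T @ XB],
                 [XB.T @ XA, XB.T @ XB]])

w = np.linalg.solve(gram, X.T @ y)  # normal equations
```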
Reverse Engineering Digital Circuits Using Structural and Functional Analyses
TLDR: We present a set of algorithms for the reverse engineering of digital circuits, starting from an unstructured netlist and resulting in a high-level netlist with components such as register files, counters, adders, and subtractors.
Revisiting Square-Root ORAM: Efficient Random Access in Multi-party Computation
TLDR: Hiding memory access patterns is required for secure computation, but remains prohibitively expensive for many interesting applications.
WordRev: Finding word-level structures in a sea of bit-level gates
TLDR: In this paper, we present a systematic way of automatically deriving word-level structures from the gate-level netlist of a digital circuit.
TAPAS: Tricks to Accelerate (encrypted) Prediction As a Service
TLDR: We combine ideas from the machine learning literature, particularly work on binarization and sparsification of neural networks, with algorithmic tools to speed up and parallelize computation on encrypted data.
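Binarization of the kind mentioned here constrains weights and activations to ±1, so each dot product reduces to XNOR-and-popcount on bits, which composes far more cheaply with encryption of binary values than real arithmetic does. A plaintext sketch of a two-layer binarized network (shapes and weights are illustrative, not from the paper):

```python
import numpy as np

def binarize(x):
    # Deterministic sign binarizer mapping reals to {-1, +1}.
    return np.where(x >= 0, 1, -1)

def bnn_layer(x_bin, W_bin):
    # With +/-1 operands, W @ x equals 2*popcount(XNOR) - n on the
    # bit encoding; here we use the equivalent integer arithmetic.
    return binarize(W_bin @ x_bin)

rng = np.random.default_rng(0)
W1 = binarize(rng.normal(size=(8, 16)))
W2 = binarize(rng.normal(size=(4, 8)))
x = binarize(rng.normal(size=16))
out = bnn_layer(bnn_layer(x, W1), W2)
```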
Distributed Vector-OLE: Improved Constructions and Implementation
TLDR: We investigate concretely efficient protocols for distributed oblivious linear evaluation over vectors (Vector-OLE).
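The Vector-OLE functionality itself is simple to state: the sender holds vectors u and v, the receiver holds a scalar x and learns w = u·x + v, so that w and -v form an additive sharing of u·x. A plaintext sketch of the ideal functionality over a prime field (the field choice and sizes are illustrative):

```python
import random

P = 2**61 - 1  # a Mersenne prime, used here as the field modulus

def vole(u, v, x):
    # Ideal functionality: the receiver's output is w = u*x + v (mod P);
    # a secure protocol computes this without revealing u, v, or x.
    return [(ui * x + vi) % P for ui, vi in zip(u, v)]

random.seed(0)
n = 8
u = [random.randrange(P) for _ in range(n)]
v = [random.randrange(P) for _ in range(n)]
x = random.randrange(P)
w = vole(u, v, x)
```

The protocols the paper studies realize this functionality distributively and efficiently; the sketch only pins down what is being computed.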
QUOTIENT: Two-Party Secure Neural Network Training and Prediction
TLDR: We present QUOTIENT, a new method for discretized training of DNNs, along with a customized secure two-party protocol for it.
Template-based circuit understanding
When verifying or reverse-engineering digital circuits, one often wants to identify and understand small components in a larger system. A possible approach is to show that the sub-circuit under…