• Publications
Federated Learning: Strategies for Improving Communication Efficiency
Two approaches to reducing uplink communication costs are proposed: structured updates, where the client learns an update directly from a restricted space parametrized with a smaller number of variables (e.g., a low-rank factorization or a random mask); and sketched updates, where the client learns a full model update and then compresses it using a combination of quantization, random rotations, and subsampling.
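A minimal NumPy sketch of a sketched update along the lines described above, combining random subsampling with stochastic uniform quantization (function names, the 1-bit default, and the 25% keep fraction are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sketch_update(update, keep_frac=0.25, num_bits=1):
    """Compress a model update via random subsampling + stochastic quantization."""
    d = update.size
    k = max(1, int(d * keep_frac))
    idx = rng.choice(d, size=k, replace=False)   # random subsampling mask
    vals = update[idx]
    lo, hi = vals.min(), vals.max()
    levels = 2 ** num_bits - 1
    # stochastic rounding to the quantization grid keeps the estimate unbiased
    scaled = (vals - lo) / (hi - lo + 1e-12) * levels
    q = np.floor(scaled + rng.random(k)).astype(np.uint8)
    return idx, q, lo, hi, d

def unsketch(idx, q, lo, hi, d, keep_frac=0.25, num_bits=1):
    """Server-side decode: dequantize and rescale for the dropped coordinates."""
    levels = 2 ** num_bits - 1
    out = np.zeros(d)
    out[idx] = lo + (q / levels) * (hi - lo)
    return out / keep_frac

u = rng.normal(size=1000)
u_hat = unsketch(*sketch_update(u))
```

Only the sampled indices, the quantized codes, and the two scalars `(lo, hi)` need to be sent uplink, which is the source of the communication savings.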
Advances and Open Problems in Federated Learning
Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.
Circulant Binary Embedding
This work proposes Circulant Binary Embedding (CBE), which generates binary codes by projecting the data with a circulant matrix, and proposes a novel time-frequency alternating optimization to learn data-dependent circulant projections, which alternately minimizes the objective in the original and Fourier domains.
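A minimal sketch of the circulant projection idea: multiplying by a circulant matrix is a circular convolution, so it can be computed in O(d log d) with the FFT instead of O(d^2). The random sign flip and the specific variable names here are illustrative, not the paper's learned projection:

```python
import numpy as np

rng = np.random.default_rng(0)

def circulant_binary_embedding(x, r, signs):
    """Binary code from a circulant projection, computed via the FFT.

    `r` is the first column of the circulant matrix; `signs` is a random
    +/-1 vector applied to the input before projection."""
    proj = np.fft.ifft(np.fft.fft(r) * np.fft.fft(signs * x)).real
    return (proj >= 0).astype(np.uint8)

d = 8
x = rng.normal(size=d)
r = rng.normal(size=d)
s = rng.choice([-1.0, 1.0], size=d)
code = circulant_binary_embedding(x, r, s)
```

The FFT identity `C(r) y = ifft(fft(r) * fft(y))` is exact for the circulant matrix whose first column is `r`, which is what makes the structured projection cheap.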
Designing Category-Level Attributes for Discriminative Visual Recognition
A novel formulation is presented to automatically design discriminative "category-level attributes", which can be efficiently encoded by a compact category-attribute matrix and which achieves intuitive and critical design criteria (category separability, learnability) in a principled way.
Orthogonal Random Features
We present an intriguing discovery related to Random Fourier Features: replacing multiplication by a random Gaussian matrix with multiplication by a properly scaled random orthogonal matrix…
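A minimal sketch of the orthogonal-variant construction described above, for a single square block: draw a random orthogonal matrix via QR of a Gaussian matrix, then rescale its rows by chi-distributed norms so each row keeps the length distribution of a Gaussian row (the function name and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def orthogonal_random_features(X, sigma=1.0):
    """Random Fourier features with an orthogonal projection matrix."""
    d = X.shape[1]
    G = rng.normal(size=(d, d))
    Q, _ = np.linalg.qr(G)                     # uniformly random orthogonal matrix
    # chi(d)-distributed row norms restore the Gaussian row-length distribution
    norms = np.sqrt(rng.chisquare(d, size=d))
    W = (norms[:, None] * Q) / sigma
    Z = X @ W.T
    return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(d)

X = rng.normal(size=(5, 16))
Phi = orthogonal_random_features(X)
# Phi @ Phi.T approximates the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))
```

The cos/sin feature map makes each row exactly unit-norm, and the orthogonality of the rows of `Q` is what reduces the variance of the kernel estimate relative to i.i.d. Gaussian projections.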
An Exploration of Parameter Redundancy in Deep Networks with Circulant Projections
We explore the redundancy of parameters in deep neural networks by replacing the conventional linear projection in fully-connected layers with the circulant projection. The circulant structure…
\(\propto\)SVM for Learning with Label Proportions
A new method, $\propto$SVM, is proposed, which explicitly models the latent unknown instance labels together with the known group label proportions in a large-margin framework and outperforms the state of the art, especially for larger group sizes.
cpSGD: Communication-efficient and differentially-private distributed SGD
This work extends and improves the previous analysis of the binomial mechanism, showing that it achieves nearly the same utility as the Gaussian mechanism while requiring fewer representation bits, which may be of independent interest.
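A minimal sketch of the binomial mechanism idea referenced above: add centered Binomial noise to integer-quantized values, so the noisy values remain integers and stay representable with few bits. The parameters `m` and `p` below are illustrative placeholders, and all privacy accounting is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def binomial_mechanism(x, m=1024, p=0.5):
    """Add centered Binomial(m, p) noise to an integer-quantized vector.

    Unlike Gaussian noise, the binomial perturbation is discrete, so the
    output can still be communicated with a bounded number of bits."""
    noise = rng.binomial(m, p, size=x.shape) - int(m * p)
    return x + noise

g = rng.integers(-8, 8, size=10)  # quantized gradient coordinates
noisy = binomial_mechanism(g)
```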
Distributed Mean Estimation with Limited Communication
This work shows that applying a structured random rotation before quantization, together with a better coding strategy, further reduces the error to O(1/n), and that the latter coding strategy is optimal up to a constant in the minimax sense, i.e., it achieves the best MSE for a given communication cost.
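A minimal sketch of rotation-then-quantization for distributed mean estimation: each client applies a shared structured rotation (a randomized Hadamard transform here), 1-bit stochastically quantizes, and the server averages and inverts the rotation. The rotation spreads energy evenly across coordinates, shrinking the dynamic range each client must quantize. All names and the 1-bit scheme are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def hadamard(n):
    """Normalized Hadamard matrix via Sylvester's construction (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

d = 8
D = rng.choice([-1.0, 1.0], size=d)       # shared random signs
H = hadamard(d)
rotate = lambda v: H @ (D * v)
unrotate = lambda v: D * (H.T @ v)        # H is orthogonal, so H.T inverts it

def quantize(v):
    """1-bit stochastic quantization: send only (min, max) and one bit/coord."""
    lo, hi = v.min(), v.max()
    p = (v - lo) / (hi - lo + 1e-12)
    bits = rng.random(v.shape) < p        # unbiased: E[decode] = v
    return np.where(bits, hi, lo)

clients = [rng.normal(size=d) for _ in range(100)]
est = unrotate(np.mean([quantize(rotate(x)) for x in clients], axis=0))
true = np.mean(clients, axis=0)
```

Because both the rotation and the stochastic quantizer are unbiased, averaging over more clients drives the estimate toward the true mean.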
Learning Discriminative and Transformation Covariant Local Feature Detectors
This work extends the covariant constraint proposed by Lenc and Vedaldi by defining the concepts of standard patch and canonical feature, leverages these to train a novel robust covariant detector, and shows that the method outperforms previous hand-crafted and learning-based detectors by large margins in terms of repeatability.