Publications
Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach
TLDR
It is proved that, when ReLU is the only non-linearity, the loss curvature is immune to class-dependent label noise; it is further shown how to estimate the noise transition probabilities by adapting a recent noise-estimation technique to the multi-class setting, yielding an end-to-end framework.
Advances and Open Problems in Federated Learning
TLDR
Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.
Statistical region merging
  • R. Nock, F. Nielsen
  • Computer Science
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1 November 2004
TLDR
Explores a statistical basis for a process often described in computer vision, image segmentation by region merging following a particular order in the choice of regions, leading to a fast segmentation algorithm that handles the most common numerical pixel attribute spaces.
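As a rough illustration of the region-merging idea, here is a minimal single-channel sketch in numpy: 4-connected pixel pairs are visited in increasing order of intensity difference and merged through a union-find structure when a statistical predicate holds. The predicate, the Q and delta parameters, and the grey-level constant below are simplified stand-ins for the paper's exact test, not its reference implementation.

```python
import numpy as np

def srm_segment(img, Q=32.0, delta=1e-6):
    """Simplified single-channel Statistical Region Merging sketch."""
    h, w = img.shape
    n = h * w
    parent = np.arange(n)                     # union-find forest
    size = np.ones(n)                         # region sizes
    mean = img.astype(float).ravel().copy()   # region means
    g = 256.0                                 # number of grey levels

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x

    # 4-connectivity edges, sorted by absolute intensity difference.
    edges = []
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w:
                edges.append((abs(float(img[y, x]) - float(img[y, x + 1])), i, i + 1))
            if y + 1 < h:
                edges.append((abs(float(img[y, x]) - float(img[y + 1, x])), i, i + w))
    edges.sort()

    def b(sz):  # deviation bound, shrinking as the region grows
        return g * np.sqrt(np.log(2.0 / delta) / (2.0 * Q * sz))

    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri == rj:
            continue
        if abs(mean[ri] - mean[rj]) <= np.sqrt(b(size[ri]) ** 2 + b(size[rj]) ** 2):
            parent[rj] = ri                   # merge rj into ri
            total = size[ri] + size[rj]
            mean[ri] = (size[ri] * mean[ri] + size[rj] * mean[rj]) / total
            size[ri] = total

    return np.array([find(i) for i in range(n)]).reshape(h, w)
```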
Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption
TLDR
This work describes a three-party end-to-end solution in two phases: privacy-preserving entity resolution, followed by federated logistic regression over messages encrypted with an additively homomorphic scheme, secure against an honest-but-curious adversary.
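To make the homomorphic building block concrete, here is a minimal sketch assuming the third-party python-paillier package (`phe`); the gradient values are made up for illustration. The paper's protocol layers entity resolution and a coordinator role on top of exactly this kind of encrypted aggregation.

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Two parties encrypt their local gradient contributions...
grad_party_a = [0.12, -0.40, 0.05]
grad_party_b = [-0.02, 0.33, 0.10]
enc_a = [public_key.encrypt(g) for g in grad_party_a]
enc_b = [public_key.encrypt(g) for g in grad_party_b]

# ...and a coordinator sums ciphertexts without seeing any plaintext:
enc_sum = [ca + cb for ca, cb in zip(enc_a, enc_b)]

# Only the key holder can decrypt the aggregated gradient.
agg = [private_key.decrypt(c) for c in enc_sum]
print(agg)  # ≈ [0.10, -0.07, 0.15]
```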
Bregman Voronoi Diagrams
TLDR
A framework for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences, which allow one to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions.
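A small numpy sketch of the underlying primitive: the Bregman divergence D_F(x, y) = F(x) - F(y) - <x - y, grad F(y)> induced by a strictly convex generator F, and first-type Voronoi cell membership as a nearest-site query. The generators and sites below are illustrative choices, not code from the paper.

```python
import numpy as np

def bregman_divergence(F, gradF, x, y):
    """D_F(x, y) = F(x) - F(y) - <x - y, grad F(y)>."""
    return F(x) - F(y) - np.dot(x - y, gradF(y))

# F(x) = ||x||^2 recovers the squared Euclidean distance.
F_sq = lambda x: np.dot(x, x)
gradF_sq = lambda x: 2.0 * x

# Negative Shannon entropy recovers the Kullback-Leibler divergence
# on the probability simplex.
F_kl = lambda x: np.sum(x * np.log(x))
gradF_kl = lambda x: np.log(x) + 1.0

def voronoi_cell(point, sites, F, gradF):
    """Index of the site owning `point` in the first-type diagram,
    i.e. argmin_i D_F(point, site_i)."""
    return int(np.argmin([bregman_divergence(F, gradF, point, s) for s in sites]))

sites = [np.array([0.7, 0.2, 0.1]), np.array([0.2, 0.3, 0.5])]
q = np.array([0.5, 0.25, 0.25])
print(voronoi_cell(q, sites, F_kl, gradF_kl))  # nearest site in the KL sense
```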
Sided and Symmetrized Bregman Centroids
  • F. Nielsen, R. Nock
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 1 June 2009
TLDR
It is proved that all three centroids are unique; the sided centroids admit closed-form solutions that are generalized means, and a provably fast and efficient arbitrarily close approximation algorithm for the symmetrized centroid is designed, based on its exact geometric characterization.
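The generalized-mean characterization is easy to verify numerically. In the sketch below (an illustration, not the paper's code), F is the negative Shannon entropy: the right-sided centroid is the arithmetic mean regardless of F, while the left-sided centroid becomes the geometric mean.

```python
import numpy as np

# Points in the positive orthant (e.g. unnormalized histograms).
X = np.array([[1.0, 4.0], [2.0, 2.0], [8.0, 1.0]])

# For F the negative Shannon entropy, grad F(x) = log x + 1,
# whose inverse is y -> exp(y - 1).
gradF = lambda x: np.log(x) + 1.0
gradF_inv = lambda y: np.exp(y - 1.0)

# Right-sided centroid argmin_c sum_i D_F(x_i, c): the arithmetic mean,
# independently of the generator F.
right_centroid = X.mean(axis=0)

# Left-sided centroid argmin_c sum_i D_F(c, x_i): the generalized mean
# (grad F)^{-1}(mean of grad F(x_i)) -- here, the geometric mean.
left_centroid = gradF_inv(gradF(X).mean(axis=0))

print(right_centroid)                     # [3.6667 2.3333]
print(left_centroid)                      # [2.5198 2.0]
print(np.prod(X, axis=0) ** (1.0 / 3.0))  # matches the geometric mean
```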
Adaptive Subspaces for Few-Shot Learning
TLDR
This paper provides a framework for few-shot learning by introducing dynamic classifiers constructed from few samples, and empirically shows that such modelling leads to robustness against perturbations and yields competitive results on supervised and semi-supervised few-shot classification.
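A toy numpy sketch of the subspace idea, under the assumption that embeddings are already given: each class gets a low-dimensional subspace fitted to its support samples via SVD, and a query is assigned to the class with the smallest projection residual. The paper additionally learns the embedding network end-to-end with a discriminative formulation.

```python
import numpy as np

def class_subspace(support, dim):
    """Basis of a `dim`-dimensional subspace through the mean of one
    class's support embeddings (via truncated SVD)."""
    mu = support.mean(axis=0)
    _, _, vt = np.linalg.svd(support - mu, full_matrices=False)
    return mu, vt[:dim].T                    # (mean, d x dim basis)

def classify(query, subspaces):
    """Pick the class whose subspace reconstructs the query best."""
    errs = []
    for mu, basis in subspaces:
        v = query - mu
        residual = v - basis @ (basis.T @ v)  # component outside the subspace
        errs.append(np.linalg.norm(residual))
    return int(np.argmin(errs))

rng = np.random.default_rng(0)
# Toy 5-shot episode: two classes, 16-d embeddings.
support_a = rng.normal(0.0, 1.0, (5, 16))
support_b = rng.normal(3.0, 1.0, (5, 16))
subspaces = [class_subspace(s, dim=3) for s in (support_a, support_b)]
query = rng.normal(3.0, 1.0, 16)             # drawn near class b
print(classify(query, subspaces))            # expected: 1
```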
Making Neural Networks Robust to Label Noise: a Loss Correction Approach
TLDR
It is proved that, when ReLU is the only non-linearity, the loss curvature is immune to class-dependent label noise; the proposed loss-correction procedures amount to at most a matrix inversion and a multiplication.
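A minimal numpy sketch of the two corrections this alludes to, assuming a known (or estimated) noise transition matrix T with rows p(noisy label | clean label); the matrix and probability values below are illustrative.

```python
import numpy as np

def backward_corrected_ce(probs, noisy_label, T):
    """Backward correction: multiply the vector of per-class cross-entropy
    losses by the inverse of the transition matrix T, which makes the
    corrected loss unbiased under class-dependent label noise."""
    per_class_loss = -np.log(probs)  # loss if the clean label were each class
    return np.linalg.inv(T)[noisy_label] @ per_class_loss

def forward_corrected_ce(probs, noisy_label, T):
    """Forward correction: push the model's clean-label posterior through T,
    then evaluate ordinary cross-entropy against the noisy label."""
    noisy_probs = T.T @ probs
    return -np.log(noisy_probs[noisy_label])

# Toy 3-class example with 20% symmetric label noise.
T = np.full((3, 3), 0.1) + np.eye(3) * 0.7  # rows: p(noisy | clean)
probs = np.array([0.8, 0.15, 0.05])         # model's softmax output
print(backward_corrected_ce(probs, 0, T))
print(forward_corrected_ce(probs, 0, T))
```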
Loss factorization, weakly supervised learning and label noise robustness
We prove that the empirical risk of most well-known loss functions factors into a linear term aggregating all labels and a term that is label-free, and can further be expressed by sums of the loss.
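For the logistic loss with labels y in {-1, +1}, the factorization is explicit: log(1 + e^{-yz}) = -yz/2 + log(e^{z/2} + e^{-z/2}), where only the first term touches the label, so the empirical risk depends on the labels solely through the mean of y times the features. The short check below verifies the identity numerically.

```python
import numpy as np

# Numeric check of the factorization for the logistic loss, y in {-1, +1}:
# log(1 + exp(-y*z)) = -y*z/2 + log(exp(z/2) + exp(-z/2)).
# The first term is linear in the label; the second is label-free.
rng = np.random.default_rng(0)
z = rng.normal(size=1000)                # classifier scores
y = rng.choice([-1.0, 1.0], size=1000)   # binary labels

lhs = np.log1p(np.exp(-y * z))
rhs = -y * z / 2.0 + np.log(np.exp(z / 2.0) + np.exp(-z / 2.0))
print(np.allclose(lhs, rhs))             # True
```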
On the chi square and higher-order chi distances for approximating f-divergences
TLDR
Reports a closed-form formula for the chi-square and higher-order chi distances between statistical distributions belonging to the same exponential family with affine natural space, and an analytic formula for f-divergences based on Taylor expansions, relying on an extended class of chi-type distances.
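A hedged sketch of one instance of the closed form, taking the exponential distributions Exp(lam) as the family (natural parameter theta = -lam, log-normalizer F(theta) = -log(-theta)): the Pearson chi-square then equals exp(F(2*theta_p - theta_q) - 2*F(theta_p) + F(theta_q)) - 1 whenever 2*theta_p - theta_q stays in the natural parameter space.

```python
import numpy as np
from scipy.integrate import quad

# Exp(lam) in natural form: p(x) = exp(theta*x - F(theta)),
# with theta = -lam and log-normalizer F(theta) = -log(-theta).
F = lambda theta: -np.log(-theta)

def chi2_closed_form(theta_p, theta_q):
    """Pearson chi-square chi^2(p:q) for same-family densities:
    exp(F(2*theta_p - theta_q) - 2*F(theta_p) + F(theta_q)) - 1,
    valid only while 2*theta_p - theta_q stays in the natural space."""
    return np.exp(F(2 * theta_p - theta_q) - 2 * F(theta_p) + F(theta_q)) - 1.0

lam_p, lam_q = 1.0, 1.5
closed = chi2_closed_form(-lam_p, -lam_q)

# Brute-force check: integrate (p - q)^2 / q over [0, inf).
p = lambda x: lam_p * np.exp(-lam_p * x)
q = lambda x: lam_q * np.exp(-lam_q * x)
numeric, _ = quad(lambda x: (p(x) - q(x)) ** 2 / q(x), 0.0, np.inf)

print(closed, numeric)  # both should be ~1/3 for these rates
```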
...