# Ranking Deep Learning Generalization using Label Variation in Latent Geometry Graphs

```bibtex
@article{Lassance2020RankingDL,
  title={Ranking Deep Learning Generalization using Label Variation in Latent Geometry Graphs},
  author={C. Lassance and Louis B{\'e}thune and Myriam Bontonou and Mounia Hamidouche and Vincent Gripon},
  journal={ArXiv},
  year={2020},
  volume={abs/2011.12737}
}
```

Measuring the generalization performance of a Deep Neural Network (DNN) without relying on a validation set is a difficult task. In this work, we propose exploiting Latent Geometry Graphs (LGGs) to represent the latent spaces of trained DNN architectures. Such graphs are obtained by connecting samples that yield similar latent representations at a given layer of the considered DNN. We then obtain a generalization score by measuring how strongly connected samples of distinct classes are in LGGs…
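The idea in the abstract can be sketched in a few lines: build a k-nearest-neighbor similarity graph over latent representations, then score how much edge weight connects samples of different classes. This is a minimal illustrative sketch, not the paper's exact formulation; the function name, the use of cosine similarity, and the normalization are assumptions.

```python
import numpy as np

def lgg_label_variation(features, labels, k=5):
    """Sketch of an LGG-style score: connect each sample to its k most
    similar samples in latent space, then accumulate the similarity
    weight on edges joining samples of different classes.
    (Illustrative; details differ from the paper's method.)"""
    # Cosine similarity between all pairs of latent vectors.
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)  # exclude self-loops

    n = len(features)
    cross_class_weight = 0.0
    for i in range(n):
        neighbors = np.argsort(sim[i])[-k:]  # indices of k most similar samples
        for j in neighbors:
            if labels[i] != labels[j]:
                cross_class_weight += sim[i, j]
    # Lower cross-class connectivity suggests better-separated classes
    # in the latent space, hence (per the paper's premise) better
    # expected generalization.
    return cross_class_weight / (n * k)
```

On well-separated clusters whose labels match the clusters, the score is zero; shuffling the labels raises it, which is the intuition behind using label variation as a generalization proxy.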

## 4 Citations

### Measuring Generalization with Optimal Transport

- Computer Science, NeurIPS
- 2021

Understanding the generalization of deep neural networks is one of the most important tasks in deep learning. Although much progress has been made, theoretical error bounds still often behave…

### Towards explaining the generalization gap in neural networks using topological data analysis

- Computer Science, ArXiv
- 2022

This paper compares the usefulness of different numerical summaries from persistence diagrams and shows that a combination of some of them can accurately predict and partially explain the generalization gap without the need of a test set.

### Weight Expansion: A New Perspective on Dropout and Generalization

- Computer Science, ArXiv
- 2022

The concept of weight expansion, an increase in the signed volume of a parallelotope spanned by the column or row vectors of the weight covariance matrix, is introduced, and weight expansion is shown to be an effective means of increasing generalization in a PAC-Bayesian setting.

### On Predicting Generalization using GANs

- Computer Science, ICLR
- 2022

This work asks whether test error can be predicted using synthetic data produced by a Generative Adversarial Network (GAN) trained on the same training dataset, and finds that it can.

## References


### Representing Deep Neural Networks Latent Space Geometries with Graphs

- Computer Science, Algorithms
- 2021

This work proposes to represent latent space geometries by constructing similarity graphs from the intermediate representations obtained when processing a batch of inputs, constrains these Latent Geometry Graphs (LGGs), and demonstrates the ability of the proposed geometry-based methods to solve the considered problems.

### An Inside Look at Deep Neural Networks Using Graph Signal Processing

- Computer Science, 2018 Information Theory and Applications Workshop (ITA)
- 2018

Several metrics and measures are compared, and it is shown that the smoothness of label signals on k-nearest-neighbor graphs is a good candidate for interpreting the role individual layers play in achieving good performance.

### Fantastic Generalization Measures and Where to Find Them

- Computer Science, ICLR
- 2020

This work presents the first large-scale study of generalization in deep networks, investigating more than 40 complexity measures taken from both theoretical bounds and empirical studies, and showing surprising failures of some measures as well as promising measures for further research.

### Identity Mappings in Deep Residual Networks

- Computer Science, ECCV
- 2016

The propagation formulations behind the residual building blocks suggest that the forward and backward signals can be directly propagated from one block to any other block, when using identity mappings as the skip connections and after-addition activation.

### In Search of Robust Measures of Generalization

- Computer Science, NeurIPS
- 2020

This work addresses the question of how to evaluate generalization bounds empirically and argues that generalization measures should instead be evaluated within the framework of distributional robustness.

### Graph Construction from Data by Non-Negative Kernel Regression

- Computer Science, ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2020

This paper proposes non-negative kernel regression (NNK), an improved approach for graph construction with appealing geometric and theoretical properties, demonstrates experimentally the efficiency of NNK graphs and their robustness to the choice of the sparsity parameter K, and shows that they can outperform state-of-the-art graph methods in semi-supervised learning tasks.

### Graph Topology Inference Benchmarks for Machine Learning

- Computer Science, 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
- 2020

This work introduces several easy-to-use and publicly released benchmarks specifically designed to reveal the relative merits and limitations of graph inference methods.

### mixup: Beyond Empirical Risk Minimization

- Computer Science, ICLR
- 2018

This work proposes mixup, a simple learning principle that trains a neural network on convex combinations of pairs of examples and their labels, which improves the generalization of state-of-the-art neural network architectures.
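The mechanism described in this blurb, training on convex combinations of pairs of examples and their labels, can be sketched as follows. This is a minimal illustrative sketch; the function name and the default `alpha` value are assumptions, not taken from the paper.

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Blend two examples and their one-hot labels with a
    Beta-distributed mixing coefficient, as in mixup-style training.
    (Illustrative sketch; names and defaults are assumptions.)"""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)       # mixing coefficient in (0, 1)
    x = lam * x1 + (1 - lam) * x2      # convex combination of inputs
    y = lam * y1 + (1 - lam) * y2      # same combination of labels
    return x, y
```

The mixed label stays a valid probability distribution (it sums to 1 when the inputs are one-hot), which is what lets the network be trained on it with a standard cross-entropy loss.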

### Predicting the Accuracy of a Few-Shot Classifier

- Computer Science, ArXiv
- 2020

This paper analyzes the reasons for the variability of generalization performance, and proposes measures that are empirically shown to correlate with the generalization ability of the considered classifiers.

### Graph Vertex Sampling with Arbitrary Graph Signal Hilbert Spaces

- Mathematics, Computer Science, ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2020

This work first states how a change of inner product impacts sampling-set selection and reconstruction, and then applies this in the context of geometric graphs to highlight how choosing an alternative inner-product matrix can help sampling-set selection and reconstruction.