Activation Landscapes as a Topological Summary of Neural Network Performance

@inproceedings{Wheeler2021ActivationLA,
  title={Activation Landscapes as a Topological Summary of Neural Network Performance},
  author={Matthew Wheeler and Jose J. Bouza and Peter Bubenik},
  booktitle={2021 IEEE International Conference on Big Data (Big Data)},
  year={2021},
  pages={3865-3870}
}
We use topological data analysis (TDA) to study how data transforms as it passes through successive layers of a deep neural network (DNN). We compute the persistent homology of the activation data for each layer of the network and summarize this information using persistence landscapes. The resulting feature map provides both an informative visualization of the network and a kernel for statistical analysis and machine learning. A statistical test shows that it correlates with classification… 
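As a rough sketch of the pipeline described above (not the authors' implementation), the snippet below pushes a batch of points through a toy ReLU network, computes the persistent homology of each layer's activations with the ripser package, and summarizes the H1 diagrams as persistence landscapes. The toy network, the landscape helper, and the fixed sampling grid are illustrative assumptions; only ripser(X, maxdim=...) is the library's documented call.

```python
import numpy as np
from ripser import ripser  # Vietoris-Rips persistent homology of a point cloud

def landscape(diagram, k_max=3, grid=np.linspace(0.0, 2.0, 100)):
    """Sample the first k_max persistence landscape functions on a grid.

    Each birth-death pair (b, d) contributes a tent function
    f(t) = max(0, min(t - b, d - t)); the k-th landscape lambda_k(t)
    is the k-th largest tent value at t.
    """
    finite = diagram[np.isfinite(diagram).all(axis=1)]  # drop infinite deaths
    tents = np.maximum(
        0.0,
        np.minimum(grid[None, :] - finite[:, [0]], finite[:, [1]] - grid[None, :]),
    )  # shape: (num_pairs, num_grid_points)
    padded = np.vstack([tents, np.zeros((k_max, grid.size))])  # ensure >= k_max rows
    return -np.sort(-padded, axis=0)[:k_max]  # k-th largest value at each grid point

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                         # stand-in input batch
layers = [rng.normal(size=(8, 8)) for _ in range(3)]  # toy 3-layer ReLU network

features, acts = [], X
for W in layers:
    acts = np.maximum(acts @ W, 0.0)             # activations at this layer
    dgms = ripser(acts, maxdim=1)["dgms"]        # [H0 diagram, H1 diagram]
    features.append(landscape(dgms[1]).ravel())  # H1 landscapes as a flat vector

feature_map = np.concatenate(features)  # one summary vector for the whole network
print(feature_map.shape)                # (900,) = 3 layers * 3 landscapes * 100 points
```

In practice the grid would be chosen from the range of the diagrams, and the per-layer landscape vectors support exactly the kind of visualization, kernel methods, and statistical testing the abstract mentions.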

References

Showing 1–10 of 53 references
Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology
TLDR
This work proposes neural persistence, a complexity measure for neural network architectures based on topological data analysis of weighted stratified graphs, and derives a neural-persistence-based stopping criterion that shortens the training process while achieving accuracy comparable to early stopping based on validation loss.
PersLay: A Neural Network Layer for Persistence Diagrams and New Graph Topological Signatures
TLDR
This work shows how graphs can be encoded by (extended) persistence diagrams in a provably stable way and proposes a general and versatile framework for learning vectorizations of persistence diagrams, which encompasses most of the vectorization techniques used in the literature.
On Characterizing the Capacity of Neural Networks using Algebraic Topology
TLDR
This paper reframes the problem of architecture selection as understanding how data determines the most expressive and generalizable architectures suited to that data, beyond inductive bias, and provides the first empirical characterization of the topological capacity of neural networks.
Applying Topological Persistence in Convolutional Neural Network for Music Audio Signals
TLDR
This paper proposes to embed the so-called "persistence landscape," a recent topological summary for data, into a convolutional neural network (CNN) for dealing with audio signals, and shows that the resulting persistent convolutional neural network (PCNN) model can perform significantly better than state-of-the-art models in prediction accuracy.
Topological Measurement of Deep Neural Networks Using Persistent Homology
TLDR
A novel approach to investigating the inner representations of DNNs through topological data analysis is proposed: simplicial complexes are constructed on DNNs based on deep Taylor decomposition, and the persistent homology (PH) of the DNNs is calculated.
Exposition and Interpretation of the Topology of Neural Networks
TLDR
Topological data analysis is used to show that the information encoded in the weights of a CNN can be organized in terms of a topological data model, how such information can be interpreted and utilized, and how topological information can be used to improve a network's performance.
Deep Learning with Topological Signatures
TLDR
This work proposes a technique that enables us to input topological signatures to deep neural networks and learn a task-optimal representation during training, realized as a novel input layer with favorable theoretical properties.
Connectivity-Optimized Representation Learning via Persistent Homology
TLDR
This work controls the connectivity of an autoencoder's latent space via a novel, differentiable loss that operates on information from persistent homology, and presents a theoretical analysis of the properties induced by the loss.
Statistical topological data analysis using persistence landscapes
Peter Bubenik. Journal of Machine Learning Research, 2015.
TLDR
A new topological summary for data is defined that is easy to combine with tools from statistics and machine learning and that obeys a strong law of large numbers and a central limit theorem; its definition is sketched after this list.
Topological Approaches to Deep Learning
TLDR
An algebraic formalism is introduced to describe and construct deep learning architectures, as well as actions on them, and it is demonstrated how these techniques can improve the transparency and performance of deep neural networks.
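Since persistence landscapes are the summary the main paper builds on, it may help to recall the standard definition from the JMLR paper above (restated here, not quoted from this page). Given a persistence diagram $D = \{(b_i, d_i)\}_{i \in I}$, each birth-death pair contributes a tent function

$$f_{(b,d)}(t) = \max\bigl(0, \min(t - b,\; d - t)\bigr),$$

and the $k$-th landscape function $\lambda_k \colon \mathbb{R} \to [0, \infty)$ records the $k$-th largest of these values at each point,

$$\lambda_k(t) = \operatorname{kmax}_{i \in I} f_{(b_i, d_i)}(t), \qquad k = 1, 2, \ldots$$

The sequence $(\lambda_k)_{k \ge 1}$ lives in a Banach space of functions, which is what makes means, laws of large numbers, and central limit theorems available for this summary.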