Transformations between deep neural networks
Tom S. Bertalan, Felix Dietrich, I. Kevrekidis
Corpus ID: 220496273

We propose to test, and when possible establish, an equivalence between two different artificial neural networks by attempting to construct a data-driven transformation between them, using manifold-learning techniques. In particular, we employ diffusion maps with a Mahalanobis-like metric. If the construction succeeds, the two networks can be thought of as belonging to the same equivalence class. We first discuss transformation functions between only the outputs of the two networks; we then…
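The key ingredient named in the abstract, diffusion maps with a Mahalanobis-like metric, can be sketched numerically. This is a minimal illustration, not the paper's actual pipeline: it assumes the commonly used symmetrized form of the distance, d²(x, y) = ½ (x − y)ᵀ(Cₓ⁻¹ + C_y⁻¹)(x − y), with a local covariance estimate per sample; the function name, toy data, and bandwidth are illustrative.

```python
import numpy as np

def mahalanobis_diffusion_maps(X, local_covs, eps=1.0, n_coords=2):
    """Sketch: diffusion-map coordinates under a Mahalanobis-like metric.

    X          : (n, d) array of samples (e.g. network outputs).
    local_covs : (n, d, d) local covariance estimate at each sample.
    """
    n = X.shape[0]
    inv_covs = np.linalg.inv(local_covs)  # C_i^{-1}, batched inverse
    D2 = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            diff = X[i] - X[j]
            # symmetrized Mahalanobis-like squared distance
            D2[i, j] = 0.5 * diff @ (inv_covs[i] + inv_covs[j]) @ diff
    K = np.exp(-D2 / eps)                  # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # drop the trivial constant eigenvector; keep the next n_coords
    return vecs.real[:, order[1:n_coords + 1]]

# Toy usage: points on a noisy circle, identity local covariances.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.standard_normal((40, 2))
covs = np.repeat(np.eye(2)[None], 40, axis=0)
coords = mahalanobis_diffusion_maps(X, covs, eps=0.5)
print(coords.shape)  # (40, 2)
```

In the setting of the abstract, the inputs would be, for example, the two networks' responses on shared data, and the resulting diffusion coordinates would be compared across networks to construct the transformation; that comparison step is omitted here.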

References

Generative Adversarial Networks
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G.
Intrinsic Isometric Manifold Learning with Application to Localization
This work builds a new metric and proposes a method for its robust estimation by assuming mild statistical priors and by using artificial neural networks as a mechanism for metric regularization and parametrization, and shows successful application to unsupervised indoor localization in ad-hoc sensor networks.
Parsimonious Representation of Nonlinear Dynamical Systems Through Manifold Learning: A Chemotaxis Case Study
Nonlinear manifold learning algorithms, such as diffusion maps, have been fruitfully applied in recent years to the analysis of large and complex data sets. However, such algorithms still encounter…
A Geometric Approach to the Transport of Discontinuous Densities
Using ideas from attractor reconstruction in dynamical systems, it is demonstrated how additional information in the form of short histories of an observation process can help to recover the underlying manifold.
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
Wasserstein Generative Adversarial Networks
This work introduces a new algorithm named WGAN, an alternative to traditional GAN training that can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches.
Nearly Isometric Embedding by Relaxation
An embedding algorithm that directly computes, for any data embedding Y, a distortion loss, and iteratively updates Y in order to decrease it; the superiority of this algorithm in obtaining low-distortion embeddings is confirmed.
Manifold learning for organizing unstructured sets of process observations.
This paper uses manifold learning to organize unstructured ensembles of observations ("trials") of a system's response surface, and demonstrates how this observation-based reconstruction naturally leads to informative transport maps between the input parameter space and output/state variable spaces.
Local Kernels and the Geometric Structure of Data
We introduce a theory of local kernels, which generalize the kernels used in the standard diffusion maps construction of nonparametric modeling. We prove that evaluating a local kernel on a data set…
Diffusion maps
In this paper, we provide a framework based upon diffusion processes for finding meaningful geometric descriptions of data sets. We show that eigenfunctions of Markov matrices can be used to construct coordinates, called diffusion maps, that generate efficient representations of complex geometric structures.
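The construction described in this entry — eigenfunctions of a Markov matrix built from a kernel on the data, used as geometric coordinates — can be sketched in a few lines of numpy. The one-dimensional toy data and the bandwidth are illustrative choices:

```python
import numpy as np

# Sketch of the basic diffusion-maps construction: Gaussian kernel,
# row normalization into a Markov matrix, leading nontrivial
# eigenvectors as diffusion coordinates.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 60))          # points along a line segment
D2 = (x[:, None] - x[None, :]) ** 2         # pairwise squared distances
K = np.exp(-D2 / 0.01)                      # Gaussian kernel
P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix
vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
phi1 = vecs.real[:, order[1]]               # first nontrivial eigenfunction
# phi1 varies smoothly with position, recovering the intrinsic
# one-dimensional parametrization of the data.
print(phi1.shape)  # (60,)
```

Because the kernel is symmetric, P is similar to a symmetric matrix, so its eigenvalues are real (up to numerical noise); the top eigenvalue is 1 with a constant eigenvector, which is why the coordinates start from the second one.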