Deep Energy-Based NARX Models

@article{Hendriks2021DeepEN,
  title={Deep Energy-Based NARX Models},
  author={Johannes N. Hendriks and Fredrik K. Gustafsson and Ant{\^o}nio H. Ribeiro and Adrian G. Wills and Thomas Bo Sch{\"o}n},
  journal={ArXiv},
  year={2021},
  volume={abs/2012.04136}
}


An energy-based deep splitting method for the nonlinear filtering problem

A deep splitting method is combined with an energy-based deep neural network approximation of functions, resulting in a computationally fast nonlinear filter that takes observations as input and does not require re-training when new observations are received.

Learning Proposals for Practical Energy-Based Regression

A conceptually simple method is introduced to automatically learn an effective proposal distribution, parameterized by a separate network head, leading to a unified training objective that jointly minimizes the KL divergence from the proposal to the EBM and the negative log-likelihood of the EBM.
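
As a rough sketch of the general form of such an objective (the notation here is illustrative, not necessarily the paper's), with an EBM p(y|x; \theta) \propto \exp f_\theta(x, y) and a proposal q(y|x; \phi) given by the extra network head:

J(\theta, \phi) = -\log p(y_i \mid x_i; \theta) + D_{\mathrm{KL}}\big( q(y \mid x_i; \phi) \,\|\, p(y \mid x_i; \theta) \big),

where the intractable normalizer of p(y|x; \theta) is approximated by importance sampling with samples drawn from q.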

Variational message passing for online polynomial NARMAX identification

It is shown empirically that the variational Bayesian estimator outperforms an online recursive least-squares estimator, most notably in small sample size settings and low noise regimes, and performs on par with an iterative least-squares estimator trained offline.

References


Implicit Generation and Modeling with Energy Based Models

This work presents techniques to scale MCMC-based EBM training on continuous neural networks, and shows its success on the high-dimensional data domains of ImageNet32x32, ImageNet128x128, CIFAR-10, and robotic hand trajectories, achieving better samples than other likelihood models and nearing the performance of contemporary GAN approaches.
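
The MCMC sampler used in this kind of training is typically Langevin dynamics on the energy. A minimal illustrative sketch in PyTorch (the names energy_net, step_size, noise_std and the hyper-parameter values are assumptions, not taken from the paper):

import torch

def langevin_sample(energy_net, x_init, n_steps=60, step_size=10.0, noise_std=0.005):
    # Approximate samples from p(x) proportional to exp(-E(x)) via noisy gradient descent on E.
    x = x_init.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        energy = energy_net(x).sum()            # scalar: energies summed over the batch
        grad, = torch.autograd.grad(energy, x)  # gradient of the energy w.r.t. the samples
        x = (x - step_size * grad + noise_std * torch.randn_like(x)).detach()
        x = x.requires_grad_(True)
    return x.detach()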

Energy-Based Models for Deep Probabilistic Regression

This work creates an energy-based model of the conditional target density p(y|x), using a deep neural network to predict the un-normalized density from (x, y), which is trained by directly minimizing the associated negative log-likelihood, approximated using Monte Carlo sampling.
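
In outline (a sketch of the standard construction, not the paper's exact notation), the conditional density and its Monte Carlo approximated negative log-likelihood are

p(y \mid x; \theta) = \frac{\exp f_\theta(x, y)}{Z(x, \theta)}, \qquad Z(x, \theta) = \int \exp f_\theta(x, \tilde{y}) \, d\tilde{y},

-\log p(y_i \mid x_i; \theta) \approx \log\Big( \frac{1}{M} \sum_{m=1}^{M} \frac{\exp f_\theta(x_i, y^{(m)})}{q(y^{(m)})} \Big) - f_\theta(x_i, y_i), \qquad y^{(m)} \sim q,

where q is a simple proposal distribution used for importance sampling of the normalizer.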

Energy-Based Models for Sparse Overcomplete Representations

A new way of extending independent component analysis (ICA) to overcomplete representations is presented, defining features as deterministic (linear) functions of the inputs and assigning energies to the features through the Boltzmann distribution.
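
Schematically (an illustrative sketch of the construction, not the paper's exact notation), the features are s_i = w_i^\top x with more filters w_i than input dimensions, each feature contributes an energy, and the density follows the Boltzmann distribution:

E(x) = \sum_i g\big(w_i^\top x\big), \qquad p(x) = \frac{\exp(-E(x))}{Z},

where g is a scalar energy function and Z the normalizing constant.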

Deep Convolutional Networks in System Identification

This paper establishes connections between the deep learning and system identification communities and explores the explicit relationships between the recently proposed temporal convolutional network (TCN) and two classic system identification model structures: Volterra series and block-oriented models.
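
For reference, a truncated discrete-time Volterra series writes the output as a polynomial functional of past inputs,

y(t) = h_0 + \sum_{\tau_1} h_1(\tau_1)\, u(t-\tau_1) + \sum_{\tau_1}\sum_{\tau_2} h_2(\tau_1, \tau_2)\, u(t-\tau_1)\, u(t-\tau_2) + \dots,

which is the model class the TCN is related to in the paper.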

Deep State Space Models for Nonlinear System Identification

Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One

This approach is the first to achieve performance rivaling the state-of-the-art in both generative and discriminative learning within one hybrid model.
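
In outline, the reinterpretation treats the K logits f_\theta(x) of a standard classifier as unnormalized log-densities:

p_\theta(x, y) = \frac{\exp\big(f_\theta(x)[y]\big)}{Z(\theta)}, \qquad p_\theta(x) = \frac{\sum_y \exp\big(f_\theta(x)[y]\big)}{Z(\theta)}, \qquad p_\theta(y \mid x) = \mathrm{softmax}\big(f_\theta(x)\big)_y,

so the usual softmax classifier is recovered as the conditional, while the marginal defines an energy-based model over x.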

A Tutorial on Energy-Based Learning

The EBM approach provides a common theoretical framework for many learning models, including traditional discriminative and generative approaches, as well as graph-transformer networks, conditional random fields, maximum margin Markov networks, and several manifold learning methods.

Noise-contrastive estimation: A new estimation principle for unnormalized statistical models

A new estimation principle is presented to perform nonlinear logistic regression to discriminate between the observed data and some artificially generated noise, using the model log-density function in the regression nonlinearity, which leads to a consistent (convergent) estimator of the parameters.
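
In outline, with data density p_d, noise density p_n, and an unnormalized model p_\theta whose log-normalizer is treated as an extra learnable parameter, the NCE objective is the logistic-regression log-likelihood

h(u; \theta) = \sigma\big(\log p_\theta(u) - \log p_n(u)\big), \qquad J(\theta) = \mathbb{E}_{x \sim p_d}\big[\log h(x; \theta)\big] + \mathbb{E}_{x' \sim p_n}\big[\log\big(1 - h(x'; \theta)\big)\big],

maximized over \theta to discriminate observed data from the artificially generated noise.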

Deep Learning

Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.

Probabilistic Regression for Visual Tracking

This work proposes a probabilistic regression formulation and applies it to tracking; the formulation is capable of modeling label noise stemming from inaccurate annotations and ambiguities in the task, and it substantially improves tracking performance.