# Twin Neural Network Regression

@article{Wetzel2022TwinNN, title={Twin Neural Network Regression}, author={Sebastian Johann Wetzel and Kevin Ryczko and Roger G. Melko and Isaac Tamblyn}, journal={ArXiv}, year={2022}, volume={abs/2012.14873} }

We introduce twin neural network (TNN) regression. This method predicts differences between the target values of two different data points rather than the targets themselves. The solution of a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the targets of an unseen data point and all training data points. Whereas ensembles are normally costly to produce, TNN regression intrinsically creates an ensemble of predictions of twice…
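The abstract's core idea — train a model on differences between pairs of targets, then recover a point prediction by averaging the predicted difference plus the known anchor target over all training points — can be sketched in a few lines. The sketch below is illustrative only: it substitutes a least-squares linear model for the twin neural network (so that a noiseless linear toy problem is fit exactly), but the pairing and ensemble-averaging inference steps follow the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data with a linear target, so the linear stand-in
# for the twin network can fit the differences exactly.
X = rng.uniform(-1, 1, size=20)
y = 3.0 * X + 1.0

# Build all ordered pairs (x_i, x_j) labeled with the target difference y_i - y_j.
I, J = np.meshgrid(np.arange(len(X)), np.arange(len(X)), indexing="ij")
pairs = np.column_stack([X[I.ravel()], X[J.ravel()]])
diffs = y[I.ravel()] - y[J.ravel()]

# Fit F(x1, x2) ~ y1 - y2 by least squares (stand-in for the twin NN).
A = np.column_stack([pairs, np.ones(len(pairs))])
w, *_ = np.linalg.lstsq(A, diffs, rcond=None)

def predict(x_new):
    # Ensemble prediction: average F(x_new, x_i) + y_i over all anchors x_i,
    # turning one query into len(X) difference predictions.
    feats = np.column_stack([np.full(len(X), x_new), X, np.ones(len(X))])
    return float(np.mean(feats @ w + y))

print(predict(0.5))  # ≈ 2.5, the true value 3*0.5 + 1
```

Each unseen point thus receives as many (correlated) predictions as there are training anchors, which is the intrinsic ensemble the abstract refers to.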

## 2 Citations

### Twin neural network regression is a semi-supervised regression algorithm

- Computer Science, Machine Learning: Science and Technology
- 2022

Semi-supervised training of twin neural network regression significantly improves TNNR performance, which is already state of the art.

### Applying the Case Difference Heuristic to Learn Adaptations from Deep Network Features

- Computer Science, ArXiv
- 2021

This paper investigates a two-phase process that combines deep learning for feature extraction with neural-network-based adaptation learning from the extracted features, and shows that the combined process can successfully learn adaptation knowledge applicable to nonsymbolic differences between cases.

## References

Showing 1-10 of 55 references.

### Snapshot Ensembles: Train 1, get M for free

- Computer Science, ICLR
- 2017

This paper proposes a method that achieves the seemingly contradictory goal of ensembling multiple neural networks at no additional training cost: a single neural network is trained so that it converges to several local minima along its optimization path, and the model parameters are saved at each.
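The mechanism behind this is a cyclic cosine-annealed learning rate: within each cycle the rate is driven from its maximum toward zero, a snapshot is saved, and the rate restarts. A minimal sketch of that schedule (parameter names are my own, not from the paper):

```python
import math

def snapshot_lr(t, total_iters, n_cycles, lr_max):
    """Cyclic cosine annealing: the learning rate restarts at lr_max
    at the start of each cycle and anneals toward zero, at which point
    a model snapshot would be saved for the ensemble."""
    cycle_len = total_iters // n_cycles
    pos = (t % cycle_len) / cycle_len                # position within current cycle
    return lr_max / 2.0 * (math.cos(math.pi * pos) + 1.0)

T, M, lr0 = 600, 3, 0.1
schedule = [snapshot_lr(t, T, M, lr0) for t in range(T)]
# Snapshot iterations sit at the end of each cycle, where lr is near zero.
snap_iters = [c * (T // M) - 1 for c in range(1, M + 1)]  # [199, 399, 599]
```

At test time the M saved snapshots are averaged like a conventional ensemble, but only one training run was paid for.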

### Neural Network Ensembles, Cross Validation, and Active Learning

- Computer Science, NIPS
- 1994

It is shown how to estimate the optimal weights of the ensemble members using unlabeled data and how the ambiguity can be used to select new training data to be labeled in an active learning scheme.

### Learning with Pseudo-Ensembles

- Computer Science, NIPS
- 2014

A novel regularizer based on making the behavior of a pseudo-ensemble robust with respect to the noise process generating it is presented, which naturally extends to the semi-supervised setting, where it produces state-of-the-art results.

### Addressing uncertainty in atomistic machine learning.

- Computer Science, Physical Chemistry Chemical Physics (PCCP)
- 2017

This work addresses the types of errors that might arise in atomistic machine learning and the unique aspects of atomistic simulations that make machine learning challenging, and highlights how uncertainty analysis can be used to assess the validity of machine-learning predictions.

### A quantitative uncertainty metric controls error in neural network-driven chemical discovery.

- Computer Science, Chemical Science
- 2019

Tightening latent distance cutoffs systematically drives down predicted model errors below training errors, thus enabling predictive error control in chemical discovery or identification of useful data points for active learning.

### Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Computer Science, ICML
- 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
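In practice this framework amounts to keeping dropout active at test time and treating repeated stochastic forward passes as samples from an approximate posterior predictive distribution. A minimal numpy sketch with an untrained toy network (the weights here are random placeholders, not a fitted model):

```python
import numpy as np

rng = np.random.default_rng(1)

# One-hidden-layer toy network with fixed random weights, standing in
# for a trained model; only the dropout sampling is the point here.
W1 = rng.normal(size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1)); b2 = np.zeros(1)

def forward(x, p_drop=0.2):
    # Dropout stays ON at prediction time: each call draws a new mask,
    # giving one sample from the approximate predictive distribution.
    h = np.maximum(x @ W1 + b1, 0.0)
    mask = rng.random(h.shape) > p_drop
    h = h * mask / (1.0 - p_drop)        # inverted-dropout scaling
    return h @ W2 + b2

x = np.array([[0.3]])
samples = np.concatenate([forward(x) for _ in range(200)])
mean, std = samples.mean(), samples.std()  # predictive mean and uncertainty
```

The spread of the samples serves as the model-uncertainty estimate, obtained with no change to the training procedure.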

### Siamese Neural Networks for One-Shot Image Recognition

- Computer Science
- 2015

A method for learning siamese neural networks that employ a unique structure to naturally rank similarity between inputs; it achieves strong results that exceed those of other deep learning models, with near state-of-the-art performance on one-shot classification tasks.

### Fast and Accurate Uncertainty Estimation in Chemical Machine Learning.

- Computer Science, Journal of Chemical Theory and Computation
- 2019

An inexpensive and reliable estimate of the uncertainty associated with the predictions of a machine-learning model of atomic and molecular properties is presented, based on resampling, with multiple models being generated based on subsampling of the same training data.

### Hydra: Preserving Ensemble Diversity for Model Distillation

- Computer Science, Environmental Science, ArXiv
- 2020

This work proposes a distillation method based on a single multi-headed neural network that improves distillation performance on classification and regression settings while capturing the uncertainty behaviour of the original ensemble over both in-domain and out-of-distribution tasks.

### Deep learning and density-functional theory

- Physics, Physical Review A
- 2019

We show that deep neural networks can be integrated into, or fully replace, the Kohn-Sham density functional theory (DFT) scheme for multielectron systems in simple harmonic oscillator and random…