• Corpus ID: 15600291

A Neural Networks Approach to Predicting How Things Might Have Turned Out Had I Mustered the Nerve to Ask Barry Cottonfield to the Junior Prom Back in 1997

@article{Armstrong2017ANN,
  title={A Neural Networks Approach to Predicting How Things Might Have Turned Out Had I Mustered the Nerve to Ask Barry Cottonfield to the Junior Prom Back in 1997},
  author={Eve Armstrong},
  journal={arXiv: Popular Physics},
  year={2017}
}
  • Eve Armstrong
  • Published 30 March 2017
  • Physics, Biology
  • arXiv: Popular Physics
We use a feed-forward artificial neural network with back-propagation through a single hidden layer to predict Barry Cottonfield's likely reply to this author's invitation to the "Once Upon a Daydream" junior prom at the Conard High School gymnasium back in 1997. To examine the network's ability to generalize to such a situation beyond specific training scenarios, we use an L2 regularization term in the cost function and examine performance over a range of regularization strengths. In addition…
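
A minimal sketch of the setup the abstract describes, assuming a plain NumPy implementation: one hidden layer, sigmoid activations trained by backpropagation, and a cross-entropy cost carrying an L2 penalty whose strength is swept over a grid. The toy data, layer width, learning rate, and the particular grid of regularization strengths are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the training data: 8 binary input features and a
# binary "yes/no" reply; the real features and labels are the paper's.
X = rng.integers(0, 2, size=(200, 8)).astype(float)
y = (X.sum(axis=1) > 4).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, n_hidden=16, lam=0.1, lr=0.5, epochs=2000):
    """One hidden layer, sigmoid activations, cross-entropy cost with an
    L2 penalty (lam / 2n) * sum(W**2) on the weights (not the biases)."""
    n, d = X.shape
    W1 = rng.normal(0, 1 / np.sqrt(d), (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 1 / np.sqrt(n_hidden), (n_hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backward pass: sigmoid + cross-entropy gives delta = p - y.
        d2 = (p - y) / n
        d1 = (d2 @ W2.T) * h * (1 - h)
        # Gradient step; the L2 term contributes (lam / n) * W.
        W2 -= lr * (h.T @ d2 + lam / n * W2); b2 -= lr * d2.sum(axis=0)
        W1 -= lr * (X.T @ d1 + lam / n * W1); b1 -= lr * d1.sum(axis=0)
    return W1, b1, W2, b2

# Examine generalization over a range of regularization strengths.
for lam in [0.0, 0.01, 0.1, 1.0, 10.0]:
    W1, b1, W2, b2 = train(X[:150], y[:150], lam=lam)
    p = sigmoid(sigmoid(X[150:] @ W1 + b1) @ W2 + b2)
    acc = ((p > 0.5) == y[150:]).mean()
    print(f"lambda={lam:5.2f}  held-out accuracy={acc:.2f}")
```

As the regularization strength grows, held-out accuracy typically first improves and then degrades as the weights are shrunk too aggressively, which is the trade-off the abstract's sweep is probing.
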
2 Citations

Science Spoofs, Physics Pranks and Astronomical Antics
Some scientists take themselves and their work very seriously. However, there are plenty of cases of humour being combined with science. Here I review some examples from the broad fields of physics…

References

SHOWING 1-10 OF 27 REFERENCES
Random synaptic feedback weights support error backpropagation for deep learning
TLDR
A surprisingly simple mechanism is presented that assigns blame by multiplying errors by fixed, random synaptic weights; it can transmit teaching signals across multiple layers of neurons and performs as effectively as backpropagation on a variety of tasks.
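
The mechanism summarized above is the paper's "feedback alignment": the backward pass sends errors through a fixed random matrix rather than through the transpose of the forward weights, as exact backpropagation would. A minimal sketch under that reading, with toy data and layer sizes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task: targets come from a hidden linear-tanh map.
X = rng.normal(size=(256, 10))
Y = np.tanh(X @ rng.normal(size=(10, 2)))

W1 = rng.normal(0, 0.1, (10, 20))   # forward weights, layer 1
W2 = rng.normal(0, 0.1, (20, 2))    # forward weights, layer 2
B = rng.normal(0, 0.1, (2, 20))     # fixed random feedback matrix; exact
                                    # backprop would use W2.T here instead

lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1)
    e = h @ W2 - Y                   # output error
    # Errors reach the hidden layer through the random matrix B, yet the
    # forward weights gradually align so the updates remain useful.
    dh = (e @ B) * (1 - h**2)
    W2 -= lr * h.T @ e / len(X)
    W1 -= lr * X.T @ dh / len(X)

print("final MSE:", np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))
```
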
Deep Neural Decision Forests
TLDR
A novel approach that unifies classification trees with the representation learning functionality known from deep convolutional networks, training them in an end-to-end manner by introducing a stochastic and differentiable decision tree model.
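
The differentiable-tree ingredient can be sketched in isolation: each inner node emits a probability of routing left, a leaf's weight is the product of the routing decisions along its path, and the output mixes the leaf class distributions. The depth, feature size, and the standalone (non-convolutional) setting below are simplifying assumptions; in the paper these trees sit on top of a deep convolutional network and are trained end to end.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

d, n_classes, depth = 5, 3, 2
n_inner, n_leaf = 2**depth - 1, 2**depth   # 3 decision nodes, 4 leaves

W = rng.normal(size=(n_inner, d))          # one routing function per inner node
leaf_logits = rng.normal(size=(n_leaf, n_classes))

def predict(x):
    """Soft routing: sigmoid(w_i . x) is the probability of going left at
    node i; a leaf's weight is the product of the decisions on its
    root-to-leaf path, and the output mixes the leaf distributions."""
    dprob = 1.0 / (1.0 + np.exp(-(W @ x)))   # P(left) at each inner node
    # Path probabilities for the 4 leaves of a depth-2 tree
    # (node 0 is the root; nodes 1 and 2 are its left/right children).
    mu = np.array([
        dprob[0] * dprob[1],                 # left, left
        dprob[0] * (1 - dprob[1]),           # left, right
        (1 - dprob[0]) * dprob[2],           # right, left
        (1 - dprob[0]) * (1 - dprob[2]),     # right, right
    ])
    return mu @ softmax(leaf_logits)         # mixture of leaf class distributions

x = rng.normal(size=d)
print(predict(x))                            # a proper distribution over 3 classes
```

Because every operation above is differentiable, both the routing weights and the leaf distributions can be trained by gradient descent, which is what lets the trees share training with a deep network.
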
Neural Random Forests
TLDR
This work reformulates the random forest method of Breiman (2001) in a neural network setting and proposes two new hybrid procedures, called neural random forests, in which both predictors exploit prior knowledge of regression trees for their architecture.
Artificial Neural Networks
  • A. Roli
  • Economics
    Lecture Notes in Computer Science
  • 1995
Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology…
Neural Decision Forests for Semantic Image Labelling
TLDR
This work introduces randomized Multi-Layer Perceptrons (rMLPs) as new split nodes which are capable of learning non-linear, data-specific representations and taking advantage of them by finding optimal predictions for the emerging child nodes.
Neural-Based Approaches for Improving the Accuracy of Decision Trees
TLDR
A Neural Decision Tree (NDT) model is proposed that combines neural network technologies with traditional decision-tree learning capabilities to handle complicated, real-world cases, and it can significantly improve the accuracy of C5.
Cross-entropy vs. squared error training: a theoretical and experimental comparison
TLDR
It is found that, with randomly initialized weights, the squared-error-based ANN does not converge to a good local optimum, whereas with a good initialization from pre-training, it can reduce the word error rate below that of the best CE-trained system.
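
The contrast is easiest to see at a softmax output layer: the cross-entropy gradient with respect to the logits is simply p − t, while the squared-error gradient carries an extra softmax-Jacobian factor that vanishes wherever the outputs saturate, which is one reason squared error struggles from a random initialization. A toy illustration with made-up logits, not values from the paper:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([4.0, -4.0, 0.0])   # logits from a badly (randomly) initialized net
t = np.array([0.0, 1.0, 0.0])    # one-hot target: the true class has low score
p = softmax(z)

# Cross-entropy: the gradient w.r.t. the logits is simply p - t, so a
# confidently wrong output still produces a large corrective signal.
grad_ce = p - t

# Squared error 0.5 * ||p - t||^2: the softmax Jacobian
# J[i, j] = p[i] * (delta_ij - p[j]) multiplies the error, and it shrinks
# toward zero wherever p saturates near 0 or 1, giving small gradients.
J = np.diag(p) - np.outer(p, p)
grad_se = J @ (p - t)

print("p       =", np.round(p, 4))
print("grad CE =", np.round(grad_ce, 4))
print("grad SE =", np.round(grad_se, 4))
```

On these values, the cross-entropy gradient on the wrongly confident class is of order one, while the squared-error gradient is roughly three orders of magnitude smaller.
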
Neural Networks and Deep Learning
  • C. Aggarwal
  • Computer Science
    Springer International Publishing
  • 2018
Consolidated Guidance Counselors
  • Records
  • 1995
Order of Restraint: Armstrong, E. West Hartford; WHPD Records Division, Stalking: 18-9-111
  • C.R.S.
  • 2002
...