Converting Cascade-Correlation Neural Nets into Probabilistic Generative Models
@article{Nobandegani2017ConvertingCN,
  title   = {Converting Cascade-Correlation Neural Nets into Probabilistic Generative Models},
  author  = {Ardavan Salehi Nobandegani and Thomas R. Shultz},
  journal = {ArXiv},
  year    = {2017},
  volume  = {abs/1701.05004}
}
Humans are not only adept at recognizing what class an input instance belongs to (i.e., a classification task); perhaps more remarkably, they can also imagine (i.e., generate) plausible instances of a desired class with ease when prompted. Inspired by this, we propose a framework for transforming Cascade-Correlation Neural Networks (CCNNs) into probabilistic generative models, thereby enabling CCNNs to generate samples from a category of interest. CCNNs are a well-known class of…
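To make the idea concrete, here is a minimal sketch of one generic way to sample class instances from a trained discriminative network: treat the target output unit's activation as an unnormalized density over inputs and run Metropolis-Hastings in input space. This is an illustration of the general approach, not the authors' exact procedure; `net_score`, the random-walk proposal, and all parameter values are assumptions.

```python
import numpy as np

def mh_sample_class(net_score, x0, n_steps=5000, step=0.1, rng=None):
    """Metropolis-Hastings over the input space of a trained classifier.

    net_score(x) is assumed to return a non-negative score (e.g., the
    activation of the output unit for the target class), so inputs the
    network 'recognizes' as the class get high scores; a random-walk
    proposal then concentrates samples on class-like inputs.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    s = net_score(x)
    samples = []
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)  # symmetric proposal
        s_new = net_score(proposal)
        # Accept with probability min(1, s_new / s); always accept if s is 0.
        if s <= 0 or rng.random() < min(1.0, s_new / s):
            x, s = proposal, s_new
        samples.append(x.copy())
    return np.array(samples)

# Toy stand-in for a trained network: scores inputs near (1, 1) highly.
score = lambda x: float(np.exp(-np.sum((x - 1.0) ** 2)))
draws = mh_sample_class(score, x0=np.zeros(2), n_steps=2000)
print(draws[-5:])  # late samples cluster around (1, 1)
```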
2 Citations
A Resource-Rational Process-Level Account of the St. Petersburg Paradox
- Economics · CogSci
- 2019
This work shows that Nobandegani et al.'s (2018) metacognitively rational model, sample-based expected utility (SbEU), can account for major experimental findings on this paradox, and presents the first resource-rational, process-level explanation of it.
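The core idea of estimating expected utility from a handful of samples can be sketched as follows. This is a plain Monte Carlo version for illustration only; the published SbEU model additionally biases its samples toward extreme outcomes via importance sampling, and the names and parameter values here are illustrative.

```python
import numpy as np

def sample_based_eu(outcomes, probs, utility, k=5, rng=None):
    """Estimate expected utility from k Monte Carlo samples.

    Rather than summing utility(o) * p(o) over every outcome, draw a
    small number of outcomes and average their utilities, trading
    accuracy for computational cost.
    """
    rng = np.random.default_rng() if rng is None else rng
    draws = rng.choice(outcomes, size=k, p=probs)
    return np.mean([utility(o) for o in draws])

# St. Petersburg gamble truncated at 30 flips: payoff 2^n w.p. 2^-n.
n = np.arange(1, 31)
outcomes = 2.0 ** n
probs = 0.5 ** n
probs /= probs.sum()  # renormalize the truncated distribution
print(sample_based_eu(outcomes, probs, utility=lambda x: x, k=5))
```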
A computational model of infant learning and reasoning with probabilities.
- Computer Science, Psychology · Psychological Review
- 2021
This work presents a novel computational system, the Neural Probability Learner and Sampler (NPLS), that learns and reasons with probabilities, providing a computationally sufficient mechanism to explain infant probabilistic learning and inference.
References
Showing 1-10 of 33 references
Reducing Network Depth in the Cascade-Correlation Learning Architecture
- Computer Science
- 1994
This paper investigates a simple variation of Cascade-Correlation that will build deep nets if necessary but is biased toward minimizing network depth, and demonstrates empirically, across a range of problems, that this bias yields shallower networks.
Knowledge-based cascade-correlation: Using knowledge to speed learning
- Computer Science · Connect. Sci.
- 2001
This work presents a new extension of the well-known generative algorithm cascade-correlation that recruits previously learned sub-networks as well as single hidden units, and is observed to find, adapt, and use its relevant knowledge to speed learning significantly.
Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons
- Computer Science, Biology · PLoS Comput. Biol.
- 2011
A neural network model is proposed, and a rigorous theoretical analysis shows that its neural activity implements MCMC sampling of a given distribution, in both discrete and continuous time.
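The construction behind this kind of result can be illustrated with the standard textbook case of stochastic binary units: if each unit fires with probability given by a logistic function of its summed input, repeated asynchronous updates form a Markov chain whose stationary distribution is a Boltzmann distribution. The sketch below is this simplified discrete-time stand-in, not the paper's spiking-neuron model; `W`, `b`, and the update schedule are assumptions.

```python
import numpy as np

def gibbs_sweep(state, W, b, rng):
    """One asynchronous sweep of a stochastic binary network.

    Each unit fires (state 1) with probability sigmoid(summed input).
    With symmetric W and zero diagonal, repeated sweeps form a Markov
    chain whose stationary distribution is the Boltzmann distribution
    p(s) proportional to exp(b @ s + s @ W @ s / 2).
    """
    for i in rng.permutation(len(state)):
        drive = b[i] + W[i] @ state
        p_fire = 1.0 / (1.0 + np.exp(-drive))
        state[i] = 1.0 if rng.random() < p_fire else 0.0
    return state

rng = np.random.default_rng(0)
W = np.array([[0.0, 2.0], [2.0, 0.0]])  # two mutually excitatory units
b = np.array([-1.0, -1.0])
s = np.zeros(2)
for _ in range(1000):
    s = gibbs_sweep(s, W, b, rng)  # s now behaves as a sample from p(s)
```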
The Cascade-Correlation Learning Architecture
- Computer Science · NIPS
- 1989
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
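The heart of the algorithm is its recruitment step: candidate hidden units are trained to maximize the magnitude of the covariance between their activation and the network's residual error, and the best-scoring candidate is installed with its input-side weights frozen. Below is a minimal sketch of Fahlman and Lebiere's candidate score (the unnormalized variant; array names are illustrative).

```python
import numpy as np

def candidate_score(candidate_act, residual_err):
    """Recruitment score S for one candidate hidden unit.

    S = sum over outputs o of | sum over patterns p of
        (v_p - v_mean) * (e_po - e_mean_o) |,
    the magnitude of covariance between the candidate's activation v
    and the residual error e. Candidates are trained to maximize S,
    and the winner becomes the next frozen hidden unit.
    """
    v = candidate_act - candidate_act.mean()      # shape: (patterns,)
    e = residual_err - residual_err.mean(axis=0)  # shape: (patterns, outputs)
    return float(np.abs(v @ e).sum())
```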
Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons
- Computer Science, Biology · PLoS Comput. Biol.
- 2011
Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are shown to be necessary ingredients of the underlying computational organization, and the approach scales up to neural emulations of probabilistic inference in fairly large graphical models.
A Tutorial on Energy-Based Learning
- Computer Science
- 2006
The EBM approach provides a common theoretical framework for many learning models, including traditional discriminative and generative approaches, as well as graph-transformer networks, conditional random fields, maximum margin Markov networks, and several manifold learning methods.
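The framework's central move is to replace probabilities with a scalar energy E(x, y) and cast inference as energy minimization over the answer variable. A minimal sketch, using a linear classifier as the concrete energy function (all names here are illustrative):

```python
import numpy as np

def ebm_predict(energy, x, labels):
    """Energy-based inference: return the label with minimal energy."""
    return labels[int(np.argmin([energy(x, y) for y in labels]))]

# A linear classifier expressed as an energy: E(x, y) = -w_y @ x.
W = np.array([[1.0, -1.0],
              [-1.0, 1.0]])  # one weight row per label
energy = lambda x, y: -W[y] @ x
print(ebm_predict(energy, np.array([0.3, -0.7]), labels=[0, 1]))  # -> 0
```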
Multistability and Perceptual Inference
- Computer Science · Neural Computation
- 2012
It is argued that the visual system approximates the posterior over underlying causes with a set of samples and that this approximation strategy produces perceptual multistability—stochastic alternation between percepts in consciousness.
Rational approximations to rational models: alternative algorithms for category learning.
- Computer Science · Psychological Review
- 2010
It is argued that Monte Carlo methods provide a source of rational process models that connect optimal solutions to psychological processes, and it is proposed that a particle filter with a single particle provides a good description of human inferences.
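A single-particle particle filter is easy to state for Chinese-restaurant-process category learning: process observations sequentially, sample each cluster assignment once from its local posterior, commit to it, and never revisit it. The sketch below uses one-dimensional Gaussian likelihoods and illustrative hyperparameters; it follows the general recipe rather than the paper's exact model specification.

```python
import numpy as np

def one_particle_categories(xs, alpha=1.0, sigma=1.0, rng=None):
    """Single-particle sequential inference for CRP category learning.

    Each observation's cluster assignment is sampled once from its
    local posterior (CRP prior times Gaussian likelihood) and then
    committed permanently: the k = 1 particle filter.
    """
    rng = np.random.default_rng() if rng is None else rng
    clusters, assignments = [], []
    for x in xs:
        n = sum(len(c) for c in clusters)
        # Existing clusters: CRP weight times likelihood at the cluster mean.
        w = [len(c) / (n + alpha) * np.exp(-(x - np.mean(c)) ** 2 / (2 * sigma**2))
             for c in clusters]
        # New cluster: CRP weight times a wider predictive density.
        w.append(alpha / (n + alpha) * np.exp(-x**2 / (2 * (sigma**2 + 1.0))))
        w = np.array(w) / np.sum(w)
        z = int(rng.choice(len(w), p=w))
        if z == len(clusters):
            clusters.append([])
        clusters[z].append(x)
        assignments.append(z)
    return assignments

print(one_particle_categories([0.1, 0.2, 5.0, 5.1, 0.15]))  # e.g. [0, 0, 1, 1, 0]
```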
Neural networks discover a near-identity relation to distinguish simple syntactic forms
- Computer Science · Minds and Machines
- 2006
Computer simulations show that an unstructured neural-network model covers the essential features of infant learning of simple grammars in an artificial language, using a near-identity relation to distinguish sentences that are consistent or inconsistent with a familiar grammar.
Probabilistic models, learning algorithms, and response variability: sampling in cognitive development
- Psychology · Trends in Cognitive Sciences
- 2014