Observer-participant models of neural processing
@article{Fry1995ObserverparticipantMO, title={Observer-participant models of neural processing}, author={Robert L. Fry}, journal={IEEE Transactions on Neural Networks}, year={1995}, volume={6}, number={4}, pages={918-928}, url={https://api.semanticscholar.org/CorpusID:11816312} }
A model is proposed in which the neuron serves as an information channel; decisions regarding the validity of a question passively posed by the neuron are made either through a sigmoidal transfer characteristic or through a maximum-likelihood decision rule.
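As a minimal sketch (our reading of the abstract, not Fry's actual formulation), the two decision rules can be viewed as soft and hard answers to the same binary question: if the log-likelihood ratio for the question is assumed linear in the synaptic inputs, the sigmoidal transfer gives the posterior probability of "yes," while the maximum-likelihood rule thresholds the same quantity. All names and the linearity assumption here are illustrative.

```python
import numpy as np

def log_odds(x, w, b):
    """Log-likelihood ratio for the neuron's binary question, assumed
    (illustratively) to be linear in the synaptic inputs x."""
    return w @ x + b

def sigmoid_decision(x, w, b):
    """Soft decision: the logistic transfer gives P(question true | x)."""
    return 1.0 / (1.0 + np.exp(-log_odds(x, w, b)))

def ml_decision(x, w, b):
    """Hard decision: maximum likelihood fires iff the log-odds are positive."""
    return log_odds(x, w, b) > 0.0
```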
30 Citations
RATIONAL NEURAL MODELS BASED ON INFORMATION THEORY
- 1996
Computer Science, Philosophy
Solutions take the form of the Hopfield neuron model with a requirement for Hebbian learning; a maximum mutual information formulation is shown to be fully constrained in this regard and to make exclusive use of locally available information.
Computation by neural and cortical systems
- 2008
Computer Science
A theory of computation is summarized which is being used to pose and solve the problems solved by pyramidal neurons and cortical systems. The theory is based on the premise that both…
Probability Density Methods for Smooth Function Approximation and Learning in Populations of Tuned Spiking Neurons
- 1998
Computer Science
Classical neural network approximation methods and learning algorithms based on continuous variables can be implemented within networks of spiking neurons without the need to make numerical estimates of the intermediate cell firing rates.
Information processing in dendrites: II. Information theoretic complexity
- 2001
Computer Science, Mathematics
Spiking neural networks for computer vision
- 2018
Computer Science, Biology
Here this approach is used to explore structural synaptic plasticity as a possible mechanism whereby biological vision systems may learn the statistics of their inputs without supervision, pointing the way to engineered vision systems with similar online learning capabilities.
Conjoint computational and morphological optimization by cortical neurons
- 2011
Computer Science
A simple and detailed model of the four phases of operation of the Carnot cycle of the cortical neuron is developed, and the neural system entropy is shown to be exactly n+1 bits, where n is the number of synaptic inputs and the increment of 1 arises from the neuron's ability to fire or not.
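A one-line arithmetic check of the n+1 bits figure, under the assumed reading that the n binary synaptic inputs and the binary fire/no-fire output are jointly uniform over all configurations:

```python
import math

n = 8                                 # number of synaptic inputs (example value)
states = 2 ** (n + 1)                 # n binary inputs plus the binary output
entropy_bits = math.log2(states)      # uniform distribution: H = log2(#states)
assert entropy_bits == n + 1          # 9 bits when n = 8
```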
12 References
Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network
- 1992
Computer Science
A local synaptic learning rule is described that performs stochastic gradient ascent in this information-theoretic quantity, for the case in which the input-output mapping is linear and the input signal and noise are multivariate Gaussian.
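A sketch of this setting under simplifying assumptions of ours (scalar output, a unit-norm constraint standing in for the paper's exact constraints): for y = w·x + ν with input covariance C and noise variance σ², the mutual information is I(x; y) = ½ log(1 + wᵀCw/σ²), and the purely local Hebbian step η·y·x has expectation η·Cw, which points up this gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])              # input covariance (example values)
sigma2 = 0.1                            # output noise variance
w = rng.normal(size=2)
eta = 0.01

for _ in range(5000):
    x = rng.multivariate_normal(np.zeros(2), C)
    y = w @ x + rng.normal(scale=np.sqrt(sigma2))
    w += eta * y * x                    # local Hebbian step; E[y*x] = C @ w
    w /= np.linalg.norm(w)              # norm constraint keeps w bounded

# Under the unit-norm constraint, w converges toward the leading eigenvector
# of C, which maximizes I(x; y) = 0.5 * log(1 + w @ C @ w / sigma2).
```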
Towards an Organizing Principle for a Layered Perceptual Network
- 1987
Computer Science
An information-theoretic optimization principle is proposed for the development of each processing stage of a multilayered perceptual network that maximizes the information that the output signal values convey about the input signal values, subject to certain constraints and in the presence of processing noise.
Self-organization in a perceptual network
- 1988
Computer Science, Physics
It is shown that even a single developing cell of a layered network exhibits a remarkable set of optimization properties that are closely related to issues in statistics, theoretical physics, adaptive signal processing, the formation of knowledge representation in artificial intelligence, and information theory.
Maximum Entropy Connections: Neural Networks
- 1991
Computer Science
This connection between probabilistic inference and neural networks gives a viewpoint on the effective assumptions and approximations being made when a Hopfield network is used as an associative memory, and motivates several modifications to the original algorithms.
From basic network principles to neural architecture: emergence of spatial-opponent cells.
- 1986
Biology
This paper, the first of three addressing the origin and organization of feature-analyzing cells in simple systems governed by biologically plausible development rules, introduces the theory of "modular self-adaptive networks," of which this system is an example, and explicitly demonstrates the emergence of a layer of spatial-opponent cells.
Prior Probabilities
- 1968
Mathematics
It is shown that in many problems, including some of the most important in practice, the ambiguity in assigning prior probabilities can be removed by applying methods of group theoretical reasoning which have long been used in theoretical physics.
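As a standard instance of this method (the scale-parameter case treated in Jaynes's paper; the notation here is ours): demanding that the prior for a scale parameter σ be invariant under rescaling σ → aσ fixes it up to a constant,

```latex
p(\sigma)\,d\sigma = p(a\sigma)\,d(a\sigma)
  \;\Longrightarrow\; p(\sigma) = a\,p(a\sigma) \quad \forall a>0
  \;\Longrightarrow\; p(\sigma) \propto \frac{1}{\sigma},
```

which is the Jeffreys prior.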
Neural networks and physical systems with emergent collective computational abilities.
- 1982
Computer Science, Physics
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
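A minimal sketch of the content-addressable memory described above (Hebbian outer-product storage and asynchronous threshold updates are standard for this model; the pattern and parameter values are illustrative):

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product storage; zeroed diagonal avoids self-coupling."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, sweeps=10, rng=np.random.default_rng(0)):
    """Asynchronous threshold updates descend the energy to a stored pattern."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train(patterns)
probe = patterns[0].copy()
probe[:2] *= -1                  # corrupt a subpart of the stored memory
assert (recall(W, probe) == patterns[0]).all()
```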
Simplified neuron model as a principal component analyzer
- 1982
Mathematics, Computer Science
A simple linear neuron model with constrained Hebbian-type synaptic modification is analyzed and a new class of unconstrained learning rules is derived. It is shown that the model neuron tends to…
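A minimal sketch of the constrained Hebbian rule the abstract points to (widely known as Oja's rule; the example covariance and step size are ours): the decay term −y²w implicitly normalizes the weights, and w converges to the first principal component of the inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])             # input covariance (example values)
w = rng.normal(size=2)
eta = 0.005

for _ in range(20000):
    x = rng.multivariate_normal(np.zeros(2), C)
    y = w @ x                          # linear unit output
    w += eta * y * (x - y * w)         # Hebbian growth with implicit decay

vals, vecs = np.linalg.eigh(C)
print(w, vecs[:, -1])                  # w approximates +/- the leading eigenvector
```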
Perceptual neural organization: some approaches based on network models and information theory.
- 1990
Computer Science, Medicine
A review article on modeling information and expressing the laws governing interactions between neurons in the form of algorithms, in order to understand the organization…
The Algebra of Probable Inference
- 1962
Mathematics
In The Algebra of Probable Inference, Richard T. Cox demonstrates that probability theory is the only theory of inductive inference that abides by logical consistency. Cox does so through a…