An Application of the Principle of Maximum Information Preservation to Linear Systems
@inproceedings{Linsker1988AnAO,
  title     = {An Application of the Principle of Maximum Information Preservation to Linear Systems},
  author    = {Ralph Linsker},
  booktitle = {NIPS},
  year      = {1988}
}
This paper addresses the problem of determining the weights for a set of linear filters (model "cells") so as to maximize the ensemble-averaged information that the cells' output values jointly convey about their input values, given the statistical properties of the ensemble of input vectors. The quantity that is maximized is the Shannon information rate, or equivalently the average mutual information between input and output. Several models for the role of processing noise are analyzed, and…
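The objective described above can be made concrete: for a linear map y = W x + n with Gaussian input of covariance C and independent Gaussian output noise of variance s2, the mutual information is I(x; y) = ½ log det(I + W C Wᵀ / s2). A minimal sketch of this quantity (the covariance and filter values are illustrative, not taken from the paper):

```python
import numpy as np

def mutual_information(W, C, s2):
    """I(x; y) in nats for y = W x + n, with x ~ N(0, C) and
    independent output noise n ~ N(0, s2 * I)."""
    k = W.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(k) + (W @ C @ W.T) / s2)
    return 0.5 * logdet

C = np.array([[2.0, 0.8],
              [0.8, 1.0]])          # assumed input covariance
v = np.linalg.eigh(C)[1][:, -1]     # principal eigenvector of C

# At equal noise, a unit-norm filter aligned with the principal direction
# of the input conveys more information than one orthogonal to it.
W_good = v.reshape(1, 2)
W_bad = np.array([[-v[1], v[0]]])
print(mutual_information(W_good, C, 0.5), mutual_information(W_bad, C, 0.5))
```

This illustrates why infomax solutions for linear Gaussian channels are tied to the input's dominant covariance directions.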
192 Citations
Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network
- Computer Science · Neural Computation
- 1992
A local synaptic learning rule is described that performs stochastic gradient ascent in this information-theoretic quantity, for the case in which the input-output mapping is linear and the input signal and noise are multivariate Gaussian.
An Information-Maximization Approach to Blind Separation and Blind Deconvolution
- Computer Science · Neural Computation
- 1995
It is suggested that information maximization provides a unifying framework for problems in "blind" signal processing, and dependencies of information transfer on time delays are derived.
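The blind-separation approach in this line of work can be sketched with the natural-gradient infomax rule: maximize the entropy of the outputs of a logistic nonlinearity, which separates super-Gaussian sources. A minimal toy example under assumed Laplacian sources and an arbitrary mixing matrix (not from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
S = rng.laplace(size=(2, n))            # super-Gaussian toy sources (assumed)
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])              # arbitrary mixing matrix
X = A @ S

# Whiten the mixtures (zero mean, identity covariance) for stability.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = (E @ np.diag(d ** -0.5) @ E.T) @ X

# Natural-gradient infomax update: W += lr * (I + (1 - 2y) u^T) W,
# with u = W x and y = sigmoid(u), averaged over a mini-batch.
W, lr, batch = np.eye(2), 0.01, 200
for _ in range(5):                      # a few passes over the data
    for i in range(0, n, batch):
        u = W @ Xw[:, i:i + batch]
        y = 1.0 / (1.0 + np.exp(-u))
        W += lr * (np.eye(2) + (1.0 - 2.0 * y) @ u.T / batch) @ W

U = W @ Xw                              # recovered sources (up to order/scale)
```

Each recovered row of U should correlate strongly with one of the original sources, up to permutation and sign.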
Information Rate Maximization over a Resistive Grid
- Computer Science · The 2006 IEEE International Joint Conference on Neural Network Proceedings
- 2006
The work considers the simplest case of optimizing a resistive grid so that the Shannon information rate across the grid's input-output boundaries is maximized, and relates the resulting solutions to principal subspace analysis (PSA).
Redundancy reduction as the basis for visual signal processing
- Computer Science · Defense, Security, and Sensing
- 1992
An environmentally driven, self-organizing principle for encoding sensory messages is proposed, based on the need to learn their statistical properties, and is demonstrated by using it to efficiently learn, without supervision, the statistics of English text.
Redundancy Reduction as a Strategy for Unsupervised Learning
- Computer Science · Neural Computation
- 1993
A local feature measure determining how much a single feature reduces the total redundancy is derived which turns out to depend only on the probability of the feature and of its components, but not on the statistical properties of any other features.
Blind signal processing by the adaptive activation function neurons
- Computer Science · Neural Networks
- 2000
Neural Decision Boundaries for Maximal Information Transmission
- Computer Science · PLoS ONE
- 2007
In a small noise limit, a general equation is derived for the decision boundary that locally relates its curvature to the probability distribution of inputs, and it is shown that for Gaussian inputs the optimal boundaries are planar, but for non-Gaussian inputs the curvature is nonzero.
Chapter 7 Information-Theoretic Learning
- Computer Science
- 1999
The chapter presents current efforts to develop ITL criteria based on integrating nonparametric density estimators with Rényi's quadratic entropy definition, and proposes an IT criterion to minimize or maximize mutual information based on an approximation of the Kullback-Leibler divergence measure.
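The Parzen-based estimator at the heart of ITL can be sketched directly: the "information potential" is the mean of Gaussian kernels over all sample pairs, and Rényi's quadratic entropy is its negative logarithm. A minimal 1-D sketch, where the kernel width sigma is an assumed choice:

```python
import numpy as np

def quadratic_entropy(x, sigma=0.5):
    """Rényi quadratic entropy estimate H2 = -log V, where the information
    potential V is the mean Gaussian kernel (variance 2*sigma**2, from
    convolving two Parzen windows of width sigma) over all sample pairs."""
    d = x[:, None] - x[None, :]
    var = 2.0 * sigma ** 2
    V = np.mean(np.exp(-d ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var))
    return -np.log(V)

# A tightly clustered sample has lower quadratic entropy than a spread one.
x_tight = np.random.default_rng(0).normal(0.0, 0.1, 400)
x_wide = np.random.default_rng(1).normal(0.0, 2.0, 400)
print(quadratic_entropy(x_tight), quadratic_entropy(x_wide))
```

Because the estimator needs only pairwise kernel evaluations, no explicit density estimate is ever formed, which is the practical appeal of the quadratic-entropy choice.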
Entropy optimization by the PFANN network: application to blind source separation.
- Computer Science · Network
- 1999
Comparisons of the performance of the proposed blind separation technique with those exhibited by existing methods show that the PFANN approach gives similar performance with a noticeable reduction in computational effort.
Energy, entropy and information potential for neural computation
- Computer Science
- 1998
The major goal of this research is to develop general nonparametric methods for the estimation of entropy and mutual information, giving a unifying point of view for their use in signal processing…
References
Self-organization in a perceptual network
- Computer Science · Computer
- 1988
It is shown that even a single developing cell of a layered network exhibits a remarkable set of optimization properties that are closely related to issues in statistics, theoretical physics, adaptive signal processing, the formation of knowledge representation in artificial intelligence, and information theory.
Rate Distortion Theory
- Prentice-Hall, Englewood Cliffs, N.J.
- 1971