Principal component extraction using recursive least squares learning

@article{Bannour1995PrincipalCE,
  title={Principal component extraction using recursive least squares learning},
  author={Sami Bannour and Mahmood R. Azimi-Sadjadi},
  journal={IEEE Transactions on Neural Networks},
  year={1995},
  volume={6},
  number={2},
  pages={457--469}
}
A new neural network-based approach is introduced for the recursive computation of the principal components of a stationary vector stochastic process. The neurons of a single-layer network are sequentially trained using a recursive least squares (RLS) type algorithm to extract the principal components of the input process. The optimality criterion is based on retaining the maximum information contained in the input sequence so as to be able to reconstruct the network inputs from the…
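The scheme the abstract describes (sequential neuron training with an RLS-type gain, so that already-extracted components can be removed before the next neuron is trained) can be sketched roughly as follows. This is not the paper's exact algorithm: the synthetic data, the gain schedule `d(n)`, and the deflation step are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stationary process: white Gaussian samples scaled and
# rotated by 30 degrees, so the covariance has eigenvalues 5 and 1.
theta = np.deg2rad(30.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = (R @ np.diag([np.sqrt(5.0), 1.0]) @ rng.standard_normal((2, 4000))).T

def extract_component(samples, n_passes=5):
    """Train one linear neuron with an RLS-type decaying gain 1/d(n),
    where d(n) accumulates the squared neuron output."""
    w = np.array([1.0, 0.0])
    d = 1.0
    for _ in range(n_passes):
        for x in samples:
            y = w @ x
            d += y * y                      # RLS-style denominator
            w = w + (y / d) * (x - y * w)   # Oja-type step with gain 1/d
    return w / np.linalg.norm(w)

# Sequential extraction: train a neuron, deflate (remove the recovered
# component from the data), then train the next neuron.
w1 = extract_component(X)
w2 = extract_component(X - np.outer(X @ w1, w1))

# Compare against a batch eigendecomposition of the sample covariance.
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
print(abs(w1 @ eigvecs[:, -1]))  # close to 1: first principal direction
print(abs(w2 @ eigvecs[:, -2]))  # close to 1: second principal direction
```

Deflation makes the extraction sequential, matching a single-layer network whose neurons are trained one at a time; the decaying gain 1/d(n) stands in for the RLS step size.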
Fast recursive least squares learning algorithm for principal component analysis
It is shown that all the information needed for PCA can be completely represented by the unnormalized weight vector, which is updated based only on the corresponding neuron input-output product.
Robust recursive least squares learning algorithm for principal component analysis
It is shown that all information needed for PCA can be completely represented by the unnormalized weight vector, which is updated based only on the corresponding neuron input-output product.
Fast principal component extraction by a homogeneous neural network
  • S. Ouyang, Z. Bao
  • Computer Science
  • 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings (Cat. No.01CH37221)
  • 2001
Two adaptive algorithms based on the WINC for extracting multiple principal components in parallel are developed; they provide an adaptive step size that leads to a significant improvement in learning performance.
Adaptive learning algorithm for principal component analysis with partial data
In this paper a fast and efficient adaptive learning algorithm for estimation of the principal components is developed. It seems to be especially useful in applications with a changing environment, where…
Combining PCA and MCA by using recursive least square learning method
  • A.S.Y. Wong, Kwok-Wo Wong, C. Leung
  • Computer Science, Mathematics
  • 1998 IEEE International Conference on Electronics, Circuits and Systems. Surfing the Waves of Science and Technology (Cat. No.98EX196)
  • 1998
Simulation results show that both the convergence speed and the compression ratio are improved, indicating that the method effectively combines the extraction of principal components with the pruning of minor components.
Fast principal component extraction by a weighted information criterion
  • S. Ouyang, Z. Bao
  • Computer Science, Mathematics
  • IEEE Trans. Signal Process.
  • 2002
A weighted information criterion (WINC) is proposed for finding the optimal solution of a linear neural network, and it is shown analytically that the optimum weights globally asymptotically converge to the principal eigenvectors of a stationary vector stochastic process.
Using recursive least square learning method for principal and minor components analysis
  • A.S.Y. Wong, Kwok-Wo Wong, A. Leung
  • Mathematics, Computer Science
  • Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP '98 (Cat. No.98CH36181)
  • 1998
Simulation results show that both the convergence speed and the compression ratio are improved, indicating that the parallel extraction method effectively combines the extraction of the principal components with the pruning of the minor components.
Principal component analysis of multispectral images using neural network
A neural network model is proposed that performs PCA directly from the original spectral images without any additional non-neuronal computations or preliminary matrix estimation; results show that the model performs well.
Recursive algorithms for principal component extraction
Two new on-line recursive algorithms, namely, the Jacobi recursive principal component algorithm (JRPCA) and the Gauss–Seidel recursive principal component algorithm (GRPCA), are introduced for the…
Image compression using principal component neural networks
The conclusion of the wide comparison among eight principal component networks is that the cascade recursive least-squares algorithm by Cichocki, Kasprzak and Skarbek exhibits the best numerical and structural properties.

References

Principal component extraction using recursive least squares learning method
A new approach is introduced for the recursive computation of the principal components of a vector stochastic process. The neurons of a single layer perceptron are sequentially trained using a…
An adaptive approach for optimal data reduction using recursive least squares learning method
  • S. Bannour, M. Azimi-Sadjadi
  • Mathematics, Computer Science
  • [Proceedings] ICASSP-92: 1992 IEEE International Conference on Acoustics, Speech, and Signal Processing
  • 1992
An approach is introduced for the recursive computation of the principal components of a vector stochastic process. The neurons of a single-layer perceptron are sequentially trained using a recursive…
A neural network learning algorithm for adaptive principal component extraction (APEX)
  • S. Kung, K. Diamantaras
  • Computer Science
  • International Conference on Acoustics, Speech, and Signal Processing
  • 1990
An algorithm called APEX which can recursively compute the principal components of a vector stochastic process using a linear neural network is proposed, and its computational advantages over previously proposed methods are demonstrated.
Neural networks and principal component analysis: Learning from examples without local minima
The main result is a complete description of the landscape attached to E in terms of principal component analysis, showing that E has a unique minimum corresponding to the projection onto the subspace generated by the first principal vectors of a covariance matrix associated with the training patterns.
Optimal unsupervised learning in a single-layer linear feedforward neural network
An optimality principle is proposed which is based upon preserving maximal information in the output units, and an algorithm for unsupervised learning based upon a Hebbian learning rule, which achieves the desired optimality, is presented.
Adaptive network for optimal linear feature extraction
  • P. Foldiak
  • Computer Science
  • International 1989 Joint Conference on Neural Networks
  • 1989
A network of highly interconnected linear neuron-like processing units and a simple, local, unsupervised rule for the modification of connection strengths between these units are proposed, making the implementation of the network easier, faster, and biologically more plausible than rules depending on error propagation.
Two-dimensional adaptive block Kalman filtering of SAR imagery
Simulation results on several images are provided to indicate the effectiveness of the proposed 2-D adaptive block Kalman filtering method when used to remove the effects of speckle noise as well as those of the additive noise.
A comparison of two eigen-networks
  • F. Palmieri, J. Zhu
  • Mathematics
  • IJCNN-91-Seattle International Joint Conference on Neural Networks
  • 1991
The authors compare two linear networks which project adaptively the input data points on their principal components. They rederive Sanger's algorithm as the result of a constrained optimization…
Adaptive Filter Theory
Background and Overview. 1. Stochastic Processes and Models. 2. Wiener Filters. 3. Linear Prediction. 4. Method of Steepest Descent. 5. Least-Mean-Square Adaptive Filters. 6. Normalized…
Simplified neuron model as a principal component analyzer
  • E. Oja
  • Mathematics, Medicine
  • Journal of mathematical biology
  • 1982
A simple linear neuron model with constrained Hebbian-type synaptic modification is analyzed and a new class of unconstrained learning rules is derived. It is shown that the model neuron tends to…
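Oja's constrained Hebbian rule, which the RLS-type algorithms above refine, fits in a few lines; the data and step size here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Zero-mean data whose dominant direction is the x-axis (variance 4 vs 1).
X = rng.standard_normal((5000, 2)) * np.array([2.0, 1.0])

w = np.array([0.6, 0.8])   # arbitrary unit-norm start
eta = 0.002                # small constant learning rate
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)   # Oja's constrained Hebbian update

print(np.linalg.norm(w))   # stays near 1 without explicit normalization
print(abs(w[0]))           # w aligns with the dominant axis
```

The subtracted term `y * w` is what keeps the weight vector implicitly normalized, in contrast to a plain Hebbian rule, whose weights would grow without bound.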