A weighted information criterion for multiple minor components and its adaptive extraction algorithms

Abstract

Minor components (MCs) play an important role in signal processing and data analysis, so developing MC extraction algorithms is valuable work. Based on the concepts of the weighted subspace and optimization theory, a weighted information criterion is proposed for searching for the optimum solution of a linear neural network. This information criterion exhibits a unique global minimum, attained if and only if the state matrix is composed of the desired MCs of the autocorrelation matrix of an input signal. Using the gradient ascent method and the recursive least squares (RLS) method, two algorithms are developed for extracting multiple MCs. The global convergence of the proposed algorithms is also analyzed by the Lyapunov method. The proposed algorithms can extract multiple MCs in parallel and have an advantage in dealing with high-dimensional matrices. Since the weighting matrix does not require an accurate value, the system design of the proposed algorithms for practical applications is facilitated. The speed and computational advantages of the proposed algorithms are verified through simulations.
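To make the setting concrete, the sketch below illustrates the basic idea of minor component extraction that the abstract builds on: the first MC is the eigenvector of the autocorrelation matrix with the smallest eigenvalue, and it can be reached by iteratively minimizing a quadratic criterion. This is a minimal, hypothetical single-MC example using projected gradient descent on the unit sphere, not the paper's weighted multi-MC criterion or its RLS-based algorithm; all variable names and the step-size choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input: 4-dimensional signal with a known covariance structure.
n, dim = 5000, 4
A = rng.normal(size=(dim, dim))
X = rng.normal(size=(n, dim)) @ A.T   # samples with covariance A @ A.T
R = X.T @ X / n                       # sample autocorrelation matrix

# Hypothetical single-MC extraction: projected gradient descent on
# J(w) = w^T R w over the unit sphere. Its minimizer is the eigenvector
# of R with the smallest eigenvalue, i.e. the first minor component.
w = rng.normal(size=dim)
w /= np.linalg.norm(w)
eta = 0.5 / np.trace(R)               # small enough since trace >= max eigenvalue
for _ in range(5000):
    w -= eta * (2.0 * R @ w)          # gradient of w^T R w is 2 R w
    w /= np.linalg.norm(w)            # project back onto the unit sphere

# Check against a direct eigendecomposition (eigh sorts eigenvalues ascending).
eigvals, eigvecs = np.linalg.eigh(R)
mc = eigvecs[:, 0]                    # eigenvector of the smallest eigenvalue
alignment = abs(mc @ w)               # close to 1.0 when w has converged to the MC
print(f"alignment with true MC: {alignment:.6f}")
```

The paper's contribution goes beyond this toy scheme: its weighted criterion lets several MCs be extracted in parallel rather than one at a time, and the RLS-based variant avoids choosing a step size by hand.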

DOI: 10.1016/j.neunet.2017.02.006

Cite this paper

@article{Gao2017AWI, title={A weighted information criterion for multiple minor components and its adaptive extraction algorithms}, author={Yingbin Gao and Xiangyu Kong and Huihui Zhang and Li'an Hou}, journal={Neural networks : the official journal of the International Neural Network Society}, year={2017}, volume={89}, pages={1-10} }