Learn More
… learning, psychologically motivated conditioning, error-correcting algorithms, etc.). While the book certainly has a coherent perspective, and contains many interesting details that are also useful for educational purposes, I still remained somewhat disappointed. I don't believe that the 'unified approach' presented here is unique, and probably neural structures …
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, we use a combination of two different approaches for linear ICA: Comon's information-theoretic approach and the projection pursuit approach. …
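As a concrete illustration of the linear ICA problem described above, the sketch below recovers two independent sources from an unknown linear mixture using scikit-learn's FastICA. The synthetic sources, mixing matrix, and parameter settings are arbitrary stand-ins for illustration, not the estimator developed in the paper.

```python
# Minimal ICA illustration: unmix two independent, non-Gaussian sources.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)

# Two statistically independent sources.
s1 = np.sign(np.sin(3 * t))          # square wave
s2 = rng.laplace(size=n)             # heavy-tailed noise
S = np.c_[s1, s2]

# Observed data: an unknown linear mixture of the sources.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

# Estimate components that are as statistically independent as possible.
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_hat = ica.fit_transform(X)          # recovered sources (up to order, scale, sign)
W = ica.components_                   # estimated unmixing matrix
```

The recovered components match the original sources only up to permutation, scaling, and sign, which is the usual indeterminacy of ICA.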
We introduce a novel fast algorithm for Independent Component Analysis, which can be used for blind source separation and feature extraction. It is shown how a neural network learning rule can be transformed into a fixed-point iteration, which provides an algorithm that is very simple, does not depend on any user-defined parameters, and is fast to converge to …
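To make the fixed-point idea concrete, here is a minimal one-unit iteration of this kind, operating on whitened data with the nonlinearity g(u) = tanh(u). The function names, tolerances, and defaults below are illustrative assumptions, not the authors' reference implementation.

```python
# One-unit fixed-point (FastICA-style) iteration on whitened data.
import numpy as np

def whiten(X):
    """Center the data and decorrelate it to unit variance (ZCA whitening)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    d, E = np.linalg.eigh(cov)
    return Xc @ E @ np.diag(1.0 / np.sqrt(d)) @ E.T

def one_unit_fastica(Z, n_iter=200, tol=1e-8, seed=0):
    """Estimate a single unmixing vector w by fixed-point iteration."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=Z.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = Z @ w                                     # projections w^T z
        g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (Z * g[:, None]).mean(axis=0) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:            # converged up to sign
            return w_new
        w = w_new
    return w

# Usage sketch: Z = whiten(X); w = one_unit_fastica(Z)  -> one row of the unmixing matrix.
```

Note that the update involves only simple expectations and a normalization, which is what makes the iteration free of user-tuned step sizes.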
Recently, independent component analysis (ICA) has been widely used in the analysis of brain imaging data. An important problem with most ICA algorithms is, however, that they are stochastic; that is, their results may be somewhat different in different runs of the algorithm. Thus, the outputs of a single run of an ICA algorithm should be interpreted with …
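One simple way to probe this run-to-run variability, sketched below on arbitrary synthetic data, is to re-estimate the components under different random initializations and compare them by absolute correlation. This only illustrates the general concern; it is not the specific reliability analysis proposed in the paper.

```python
# Compare ICA components across several runs with different random seeds.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
S = rng.laplace(size=(2000, 3))                       # synthetic independent sources
X = S @ rng.normal(size=(3, 3)).T                     # observed mixtures

def estimate(X, seed):
    ica = FastICA(n_components=3, whiten="unit-variance",
                  random_state=seed, max_iter=1000)
    return ica.fit_transform(X)

runs = [estimate(X, seed) for seed in range(5)]       # five runs, different initializations

def best_match(S_a, S_b):
    """For each component of run A, its best |correlation| with any component of run B."""
    C = np.corrcoef(S_a.T, S_b.T)[:S_a.shape[1], S_a.shape[1]:]
    return np.abs(C).max(axis=1)

print(best_match(runs[0], runs[1]))                   # values near 1 indicate stable components
```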
Separation of complex valued signals is a frequently arising problem in signal processing. For example, separation of convolutively mixed source signals involves computations on complex valued signals. In this article, it is assumed that the original, complex valued source signals are mutually statistically independent, and the problem is solved by the …
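As a small sketch of the kind of complex-valued computation involved, the function below whitens complex observations (for example, one frequency bin of a convolutive mixture after a short-time Fourier transform). It covers only the standard preprocessing step, not the separation algorithm developed in the article; variable names are illustrative.

```python
# Whitening of complex-valued observations before complex ICA.
import numpy as np

def complex_whiten(X):
    """Zero-mean, unit-covariance transform for complex data X of shape (n_samples, n_features).

    Assumes the sample covariance is full rank.
    """
    Xc = X - X.mean(axis=0)
    C = (Xc.conj().T @ Xc) / Xc.shape[0]       # Hermitian sample covariance
    d, E = np.linalg.eigh(C)                   # eigh handles Hermitian matrices
    return Xc @ E @ np.diag(1.0 / np.sqrt(d)) @ E.conj().T

# After whitening, (Z.conj().T @ Z) / len(Z) is approximately the identity matrix.
```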
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods …
In ordinary independent component analysis, the components are assumed to be completely independent, and they do not necessarily have any meaningful order relationships. In practice, however, the estimated "independent" components are often not at all independent. We propose that this residual dependence structure could be used to define a topographic order …
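A small check of the residual dependence mentioned above: after ordinary ICA the components are approximately uncorrelated, but their energies (squared values) can still be correlated, and measuring those energy correlations is one way to quantify the dependence that a topographic ordering could exploit. The data and estimator below are arbitrary stand-ins, not the topographic model itself.

```python
# Measure residual (energy) correlations between ordinary ICA components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.laplace(size=(5000, 4)) @ rng.normal(size=(4, 4)).T   # stand-in observations

S_hat = FastICA(n_components=4, whiten="unit-variance",
                random_state=0, max_iter=1000).fit_transform(X)

linear_corr = np.corrcoef(S_hat.T)            # close to the identity matrix
energy_corr = np.corrcoef((S_hat ** 2).T)     # off-diagonal terms reveal residual dependence
print(np.round(energy_corr, 2))
```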
Olshausen and Field (1996) applied the principle of independence maximization by sparse coding to extract features from natural images. This leads to the emergence of oriented linear filters that have simultaneous localization in space and in frequency, thus resembling Gabor functions and simple cell receptive fields. In this article, we show that the same …
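The following sketch reproduces the flavor of that experiment with off-the-shelf tools: ICA basis filters learned from patches of a natural image typically come out oriented and localized. The image, patch size, and component count are arbitrary choices here, and this is not the exact procedure of the article.

```python
# Learn ICA basis filters from natural-image patches.
import numpy as np
from sklearn.datasets import load_sample_image
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import FastICA

img = load_sample_image("china.jpg").mean(axis=2)            # grayscale natural image
patches = extract_patches_2d(img, (12, 12), max_patches=20000, random_state=0)
X = patches.reshape(len(patches), -1).astype(float)
X -= X.mean(axis=1, keepdims=True)                           # remove per-patch DC component

ica = FastICA(n_components=49, whiten="unit-variance", random_state=0, max_iter=500)
ica.fit(X)
filters = ica.components_.reshape(-1, 12, 12)                # typically oriented, localized filters
```

Plotting the rows of `filters` as 12x12 images usually shows Gabor-like, edge-selective patterns.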
Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to redundancy reduction and independent component analysis, and has some neurophysiological plausibility. In this paper, we show how sparse coding can be used for …
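To make "rarely significantly active" concrete, the toy sketch below fits a sparse code with scikit-learn's DictionaryLearning and measures the fraction of nonzero coefficients. The data and parameters are placeholders, and this generic dictionary-learning coder is only a stand-in, not the method developed in the paper.

```python
# Toy sparse coding: most coefficients of each sample's code are (near) zero.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                 # stand-in data matrix

dl = DictionaryLearning(n_components=30, alpha=1.0, random_state=0, max_iter=200)
codes = dl.fit_transform(X)                    # sparse coefficients, shape (500, 30)

active_fraction = np.mean(np.abs(codes) > 1e-8)
print(f"fraction of active coefficients: {active_fraction:.2f}")   # typically well below 1
```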