A fundamental problem in neural network research, as well as in many other disciplines, is finding a suitable representation of multivariate data, i.e. random vectors.

We introduce a family of new contrast (objective) functions for ICA using maximum entropy approximations of differential entropy, which enable both the estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection pursuit directions.
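Estimating a single independent component with such a contrast function amounts to a one-unit fixed-point (FastICA-style) iteration on whitened data. Below is a minimal sketch, not the paper's exact algorithm: it assumes two illustrative non-Gaussian sources (uniform and Laplace), a hand-picked mixing matrix, and the common tanh nonlinearity as the contrast derivative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two illustrative non-Gaussian sources (sub- and super-Gaussian).
s = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.5], [0.5, 1.0]])  # assumed mixing matrix
x = A @ s

# Center and whiten the mixtures so that E[z z^T] = I.
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / n)
z = E @ np.diag(d ** -0.5) @ E.T @ x

# One-unit fixed-point iteration with the tanh contrast:
#   w <- E[z g(w^T z)] - E[g'(w^T z)] w,  then renormalize.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    wz = w @ z
    w_new = (z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    if abs(abs(w_new @ w) - 1) < 1e-10:  # converged (up to sign)
        w = w_new
        break
    w = w_new

# The extracted component should match one source up to sign and scale.
y = w @ z
corrs = [abs(np.corrcoef(y, s[i])[0, 1]) for i in range(2)]
print(max(corrs))
```

The sign/scale indeterminacy is intrinsic to ICA, which is why the check uses the absolute correlation with each source.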

In this chapter, we discuss a statistical generative model called independent component analysis. It is essentially a proper probabilistic formulation of the ideas underpinning sparse coding. It shows…

A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data that minimizes the statistical dependence of the components of the representation.

We show how to discover the causal structure of continuous-valued data, under the assumptions that (a) the data generating process is linear, (b) there are no unobserved confounders, and (c) disturbance variables have non-Gaussian distributions of non-zero variances.
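Under these assumptions, non-Gaussianity makes the causal direction identifiable: regressing in the true direction leaves a residual independent of the regressor, while the reverse regression does not. The sketch below illustrates this for a single cause-effect pair; it is not the full estimation procedure, and it uses a crude correlation-of-squares proxy for independence rather than a proper independence test.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Illustrative linear non-Gaussian pair: x causes y, uniform disturbance.
x = rng.uniform(-1, 1, n)
e = rng.uniform(-1, 1, n)
y = 1.0 * x + e

def dependence_score(cause, effect):
    """OLS-regress effect on cause, then measure dependence between the
    residual and the regressor. The residual is uncorrelated with the
    regressor by construction, so we use the correlation of their squares
    as a simple higher-order dependence proxy (illustrative only)."""
    b = np.cov(cause, effect)[0, 1] / np.var(cause)
    r = effect - b * cause
    return abs(np.corrcoef(r ** 2, cause ** 2)[0, 1])

forward = dependence_score(x, y)   # correct direction: near-independent residual
backward = dependence_score(y, x)  # reverse direction: dependent residual
print(forward, backward)
```

With Gaussian disturbances both scores would vanish and the direction would be unidentifiable, which is exactly why assumption (c) is needed.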

A fast fixed-point algorithm capable of separating complex-valued, linearly mixed source signals is presented, and its computational efficiency is demonstrated by simulations.
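The complex case follows the same fixed-point pattern on whitened data, with the nonlinearity applied to |wᴴz|². Below is a minimal sketch under assumed conditions (circular QPSK-like sources, a hand-picked complex mixing matrix, and the robust contrast g(u) = 1/(2√(a + u))); it is an illustration of the fixed-point idea, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Illustrative circular non-Gaussian complex sources (QPSK-like symbols).
s = (rng.choice([-1.0, 1.0], (2, n)) + 1j * rng.choice([-1.0, 1.0], (2, n))) / np.sqrt(2)
A = np.array([[1.0 + 0.3j, 0.4 - 0.2j],   # assumed complex mixing matrix
              [0.2 + 0.5j, 0.9 + 0.1j]])
x = A @ s

# Center and whiten using the Hermitian covariance C = E[x x^H].
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.conj().T / n)
z = E @ np.diag(d ** -0.5) @ E.conj().T @ x

# Smooth contrast derivative and its derivative (a = 0.1 for stability).
g = lambda u: 1.0 / (2.0 * np.sqrt(0.1 + u))
dg = lambda u: -1.0 / (4.0 * (0.1 + u) ** 1.5)

# Complex one-unit fixed point:
#   w <- E[z (w^H z)* g(|w^H z|^2)] - E[g(|w^H z|^2) + |w^H z|^2 g'(|w^H z|^2)] w
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    wz = w.conj() @ z
    u = np.abs(wz) ** 2
    w = (z * np.conj(wz) * g(u)).mean(axis=1) - (g(u) + u * dg(u)).mean() * w
    w /= np.linalg.norm(w)

# The extracted component matches one source up to a complex phase/scale,
# so compare via the magnitude of the normalized inner product.
y = w.conj() @ z
corrs = [abs(np.vdot(y, s[i])) / (np.linalg.norm(y) * np.linalg.norm(s[i]))
         for i in range(2)]
print(max(corrs))
```

Note the extra indeterminacy relative to the real case: the recovered component is defined only up to an arbitrary complex phase.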

learning, psychologically motivated conditioning, error-correcting algorithms, etc.). While the book certainly has a coherent perspective, and contains many interesting details useful also for…