Noboru Murata

In this paper we introduce a new technique for blind source separation of speech signals. We focus on the temporal structure of the signals, in contrast to most other major approaches to this problem. The idea is to apply the decorrelation method proposed by Molgedey and Schuster in the time-frequency domain. We show some results of experiments with both …
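The core of the Molgedey–Schuster decorrelation step can be sketched as follows: whiten the observed mixtures, then diagonalize a time-lagged covariance matrix, whose eigenvectors give the separating rotation. This is a minimal time-domain illustration (the paper applies the idea in the time-frequency domain); the synthetic sources and mixing matrix are illustrative, not from the paper.

```python
import numpy as np

# Two synthetic sources with distinct temporal structure (illustrative).
t = np.arange(5000)
s = np.vstack([np.sin(0.05 * t),            # smooth source
               np.sign(np.sin(0.3 * t))])   # square-wave source
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                  # unknown mixing matrix
x = A @ s                                   # observed mixtures

# Whitening step: decorrelate and normalize the mixtures.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / x.shape[1])
z = E @ np.diag(d ** -0.5) @ E.T @ x

# Diagonalize the symmetrized lag-tau covariance; its eigenvectors give
# the rotation separating sources with distinct temporal correlations.
tau = 1
C = z[:, :-tau] @ z[:, tau:].T / (z.shape[1] - tau)
_, U = np.linalg.eigh((C + C.T) / 2)
y = U.T @ z                                 # recovered sources (up to order and sign)
```

Because the two sources have different lag-1 autocorrelations, the eigenvalues of the lagged covariance are distinct and the rotation is identifiable.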
The problem of model selection, or determination of the number of hidden units, can be approached statistically, by generalizing Akaike's information criterion (AIC) to be applicable to unfaithful (i.e., unrealizable) models with general loss criteria including regularization terms. The relation between the training error and the generalization error is …
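The baseline criterion being generalized here is Akaike's AIC = 2k − 2 ln L. A minimal sketch of using it to compare two network sizes, with hypothetical parameter counts and log-likelihoods (not values from the paper):

```python
def aic(k, log_likelihood):
    """Akaike's information criterion: k free parameters,
    log_likelihood the maximized log-likelihood."""
    return 2 * k - 2 * log_likelihood

# Hypothetical fits: the larger model fits the data better
# but pays a complexity penalty.
models = {"3 hidden units": (10, -120.0),
          "10 hidden units": (31, -112.0)}
scores = {name: aic(k, ll) for name, (k, ll) in models.items()}
best = min(scores, key=scores.get)   # lower AIC is preferred
```

Here the smaller network wins despite its worse fit, which is the trade-off the criterion formalizes; the paper's contribution is extending this to unrealizable models and regularized losses.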
A statistical theory for overtraining is proposed. The analysis treats general realizable stochastic neural networks, trained with Kullback-Leibler divergence in the asymptotic case of a large number of training examples. It is shown that the asymptotic gain in the generalization error is small if we perform early stopping, even if we have access to the …
In this paper we examine on-line learning within a statistical framework. First we study the cases with fixed and annealed learning rates. It can be shown that on-line learning with a 1/t-annealed learning rate minimizes the generalization error at the same rate as batch learning in the asymptotic regime; that is, on-line learning can be as effective as batch …
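A minimal instance of the 1/t-annealing idea: estimating a mean on-line with learning rate η_t = 1/t. With this schedule the stochastic update reproduces the running sample mean exactly, so the on-line estimate coincides with the batch estimate. The data distribution is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=1.0, size=10000)

# On-line gradient descent on squared error with eta_t = 1/t.
theta = 0.0
for t, x in enumerate(data, start=1):
    eta = 1.0 / t                      # annealed learning rate
    theta += eta * (x - theta)         # stochastic gradient step

batch_mean = data.mean()               # batch solution for comparison
```

For general loss functions the equivalence is only asymptotic, which is the regime the paper analyzes; this toy case is the one where it holds exactly.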
We propose a method of ICA for separating convolutive mixtures of acoustic signals. The acoustic signals recorded in a real environment are not instantaneous but convolutive mixtures, because of delays and reflections. In order to separate these signals, it is effective to transform the signals into the time-frequency domain. The difficult point in these …
We aim at an extension of AdaBoost to U-Boost, in the paradigm of building a stronger classification machine from a set of weak learning machines. A geometric understanding of the Bregman divergence defined by a generic convex function U leads to the U-Boost method in the framework of information geometry extended to the space of finite measures over a …
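AdaBoost itself corresponds to the choice U(x) = exp(x) in the U-Boost family. A minimal sketch of that special case on one-dimensional threshold stumps, with illustrative data and stump set (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=200)
thresholds = np.linspace(-1, 1, 41)         # candidate stump thresholds
y = np.where(X > thresholds[22], 1, -1)     # labels generated by one such stump

w = np.full(len(X), 1.0 / len(X))           # example weights
F = np.zeros(len(X))                        # boosted score
for _ in range(10):
    # Weak learner: the stump with the smallest weighted error.
    errs = [np.sum(w * (np.where(X > th, 1, -1) != y)) for th in thresholds]
    h = np.where(X > thresholds[int(np.argmin(errs))], 1, -1)
    err = max(np.sum(w * (h != y)), 1e-12)  # clip to avoid log(0)
    alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak learner
    F += alpha * h
    w *= np.exp(-alpha * y * h)             # exponential-loss reweighting
    w /= w.sum()

train_acc = np.mean(np.sign(F) == y)
```

The reweighting line is where the convex function U enters: replacing the exponential with another convex U and its Bregman divergence yields the other members of the U-Boost family.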
We propose a robust approach for independent component analysis (ICA) of signals where observations are contaminated with high-level additive noise and/or outliers. The source signals may contain mixtures of both sub-Gaussian and super-Gaussian components, and the number of sources is unknown. Our robust approach includes two procedures. In the first …
An adaptive on-line algorithm extending the learning of learning idea is proposed and theoretically motivated. Relying only on gradient flow information, it can be applied to learning continuous functions or distributions, even when no explicit loss function is given and the Hessian is not available. Its efficiency is demonstrated for a non-stationary blind …