- Published 1999 in IDA

A method for the linear discrimination of two classes has been proposed by us in [3]. It searches for the discriminant direction which maximizes the distance between the projected class-conditional densities. It is a nonparametric method in the sense that the densities are estimated from the data. Since the distance between the projected densities is a highly nonlinear function of the projection direction, we maximize the objective function by an iterative optimization algorithm. The solution of this algorithm depends strongly on the starting point of the optimizer, and the observed maximum may be merely a local maximum. In [3] we proposed a procedure for recursive optimization which searches for several local maxima of the objective function, ensuring that a maximum already found will not be chosen again at a later stage. In this paper we refine this method. We propose a procedure which provides batch-mode optimization instead of the interactive optimization employed in [3]. By means of a simulation we compare our procedure with conventional optimization starting the optimizers at random. The results obtained confirm the efficacy of our method.
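The idea described in the abstract, maximizing the distance between projected class-conditional density estimates via a local optimizer run from several starting directions in batch mode, can be sketched roughly as follows. This is only an illustrative reconstruction, not the paper's actual algorithm: the toy data, the L2 distance between Gaussian-kernel density estimates, the Nelder-Mead optimizer, and the cosine-similarity test for discarding duplicate maxima are all assumptions made here for the sketch.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Toy two-class data in 2-D (hypothetical stand-in for real data)
X1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
X2 = rng.normal([2.0, 1.0], 1.0, size=(100, 2))

def density_distance(w):
    """L2 distance between the two projected class-conditional KDEs.

    The direction w is normalized, so the objective depends only on
    the projection direction, not on the length of w.
    """
    w = w / np.linalg.norm(w)
    p1 = gaussian_kde(X1 @ w)   # density of class 1 projected onto w
    p2 = gaussian_kde(X2 @ w)   # density of class 2 projected onto w
    t = np.linspace(-6.0, 8.0, 400)          # evaluation grid
    return float(np.sum((p1(t) - p2(t)) ** 2) * (t[1] - t[0]))

def multistart(n_starts=10):
    """Batch-mode multi-start optimization: run a local optimizer from
    several random unit directions and keep only distinct local maxima."""
    found = []   # list of (direction, objective value) pairs
    for _ in range(n_starts):
        w0 = rng.normal(size=2)
        res = minimize(lambda w: -density_distance(w), w0,
                       method="Nelder-Mead")
        w = res.x / np.linalg.norm(res.x)
        # Treat a direction nearly collinear with a previous one
        # as the same maximum found again (assumed duplicate test).
        if all(abs(w @ v) < 0.99 for v, _ in found):
            found.append((w, -res.fun))
    return found

maxima = multistart()
```

Note that the paper's recursive procedure guarantees that an already-found maximum is not chosen again during the optimization itself; the sketch above only filters duplicates after the fact, which is the simpler (and weaker) variant of the same idea.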

@inproceedings{Aladjem1999NonparametricLD,
title={Nonparametric Linear Discriminant Analysis by Recursive Optimization with Random Initialization},
author={Mayer Aladjem},
booktitle={IDA},
year={1999}
}