Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis


In this paper we consider a novel Bayesian interpretation of Fisher’s discriminant analysis. We relate Rayleigh’s coefficient to a noise model that minimises a cost based on the most probable class centres and that abandons the ‘regression to the labels’ assumption used by other algorithms. Optimisation of the noise model yields a direction of discrimination equivalent to Fisher’s discriminant, and with the incorporation of a prior we can apply Bayes’ rule to infer the posterior distribution over the direction of discrimination. However, we argue that an additional constraining distribution must be included if sensible results are to be obtained. Going further, with the use of a Gaussian process prior we show the equivalence of our model to a regularised kernel Fisher’s discriminant. A key advantage of our approach is the ability to determine kernel parameters and the regularisation coefficient through the optimisation of the marginal log-likelihood of the data. A further benefit of the new formulation is that it allows us to link the regularisation coefficient with the generalisation error.
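To make the regularised kernel Fisher's discriminant mentioned above concrete, the sketch below computes its dual direction on toy data. This is a minimal illustration of the standard technique, not the paper's Bayesian formulation: the kernel parameter `gamma` and regularisation coefficient `reg` are fixed by hand here, whereas the paper proposes choosing them by maximising the marginal log-likelihood. All names and values are our own choices for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances -> RBF (Gaussian) kernel matrix
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fisher_direction(X, y, gamma=1.0, reg=1e-2):
    """Regularised kernel Fisher's discriminant (dual form).

    Returns coefficients alpha so that a point x projects to
    sum_i alpha_i k(x_i, x).  alpha maximises Rayleigh's coefficient,
    the ratio of between-class to within-class scatter in feature space.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    idx1, idx0 = np.where(y == 1)[0], np.where(y == 0)[0]
    m1 = K[:, idx1].mean(axis=1)   # kernel mean of class 1
    m0 = K[:, idx0].mean(axis=1)   # kernel mean of class 0
    # Within-class scatter matrix N in the dual representation
    N = np.zeros((n, n))
    for idx in (idx1, idx0):
        Kj = K[:, idx]
        Hj = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kj @ Hj @ Kj.T
    # The ridge term reg * I keeps N invertible; the maximiser of
    # Rayleigh's coefficient is then alpha ∝ (N + reg I)^{-1} (m1 - m0)
    alpha = np.linalg.solve(N + reg * np.eye(n), m1 - m0)
    return alpha

# Toy data: two Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha = kernel_fisher_direction(X, y, gamma=0.5, reg=1e-2)
proj = rbf_kernel(X, X, 0.5) @ alpha  # 1-D projections of the training points
print(proj[y == 1].mean() > proj[y == 0].mean())  # classes separate along alpha
```

Because the within-class matrix plus the ridge term is positive definite, the mean projection of class 1 always exceeds that of class 0, which is what the final check confirms.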


Semantic Scholar estimates that this publication has 72 citations based on the available data.


Cite this paper

@article{Centeno2006OptimisingKP,
  title   = {Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis},
  author  = {Tonatiuh Pe{\~n}a Centeno and Neil D. Lawrence},
  journal = {Journal of Machine Learning Research},
  year    = {2006},
  volume  = {7},
  pages   = {455--491}
}