Regularisation of Linear Classifiers by Adding Redundant Features

@article{Skurichina1999RegularisationOL,
  title={Regularisation of Linear Classifiers by Adding Redundant Features},
  author={Marina Skurichina and Robert P. W. Duin},
  journal={Pattern Analysis & Applications},
  year={1999},
  volume={2},
  pages={44-52}
}
The Pseudo Fisher Linear Discriminant (PFLD), which is based on a pseudo-inverse technique, shows a peaking behaviour of the generalisation error for training sample sizes close to the feature size: as the training sample size increases, the generalisation error first decreases to a minimum, then rises to a maximum at the point where the training sample size equals the data dimensionality, and afterwards decreases again. A number of ways exist to solve this…
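
To make the peaking behaviour and the redundant-feature remedy concrete, the following is a minimal NumPy sketch (not the authors' code): it trains a PFLD via the pseudo-inverse of the pooled within-class covariance at the critical point where the training sample size equals the dimensionality, and then repeats the experiment after appending pure-noise redundant features so that the dimensionality again exceeds the training size. The data model, the noise scale, and the number of added features are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the PFLD and of regularisation by adding redundant
# features. Data generation and the noise-feature scheme are illustrative
# assumptions, not the construction used in the paper.
import numpy as np

def pfld_fit(X, y):
    """Fit a two-class Fisher discriminant using the pseudo-inverse
    of the within-class covariance estimate (the PFLD)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class covariance estimate (up to scaling); it is singular when
    # the training size is below the dimensionality, hence the pseudo-inverse.
    S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.pinv(S) @ (m1 - m0)
    b = -w @ (m0 + m1) / 2.0
    return w, b

def pfld_predict(w, b, X):
    return (X @ w + b > 0).astype(int)

def add_redundant_features(X, n_extra, scale=1.0, seed=None):
    """Append n_extra pure-noise features -- an illustrative stand-in for
    the redundant-feature regularisation studied in the paper."""
    rng = np.random.default_rng(seed)
    return np.hstack([X, scale * rng.standard_normal((X.shape[0], n_extra))])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n_train, n_test = 30, 30, 1000   # n_train == p: the peak of the error curve
    shift = np.full(p, 0.3)             # class-mean separation (assumed Gaussian data)

    def sample(n):
        y = rng.integers(0, 2, n)
        X = rng.standard_normal((n, p)) + np.outer(y, shift)
        return X, y

    Xtr, ytr = sample(n_train)
    Xte, yte = sample(n_test)

    w, b = pfld_fit(Xtr, ytr)
    err_plain = np.mean(pfld_predict(w, b, Xte) != yte)

    # Enlarge the dimensionality well past the training size by adding
    # redundant noise features (the same augmentation is applied at test time).
    n_extra = 70
    w, b = pfld_fit(add_redundant_features(Xtr, n_extra, seed=1), ytr)
    err_aug = np.mean(pfld_predict(w, b, add_redundant_features(Xte, n_extra, seed=2)) != yte)

    print(f"PFLD error at n = p: {err_plain:.3f}; with redundant features: {err_aug:.3f}")
```

With 30 training samples in 30 dimensions the within-class covariance estimate is singular and the pseudo-inverse solution sits near the peak of the error curve; padding the representation with redundant features moves the classifier away from that critical ratio of training size to dimensionality, which is the regularising effect the paper investigates.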

Citations

This paper has 20 citations; 12 citing publications were extracted.

References

Publications referenced by this paper; a subset of the 14 references is shown below.

On expected classification error of the Fisher linear classifier with pseudoinverse covariance matrix

  • Š. Raudys, R. P. W. Duin
  • Pattern Recognition Letters
  • 1998

A neural network applied to spot counting

  • A. Hoekstra, H. Netten, D. de Ridder
  • Proceedings of ACSI’96, the Second Annual…
  • 1996

Optimal regularization of neural networks and ridge estimates of the covariance matrix in statistical classification

  • Š. Raudys, M. Skurichina, T. Cibas, P. Gallinari
  • Pattern Recognition and Image Analysis: Advances…
  • 1995

