Combining Multiple K-Nearest Neighbor Classifiers for Text Classification by Reducts

@inproceedings{Bao2002CombiningMK,
  title={Combining Multiple K-Nearest Neighbor Classifiers for Text Classification by Reducts},
  author={Yongguang Bao and Naohiro Ishii},
  booktitle={Discovery Science},
  year={2002}
}
The basic k-nearest neighbor classifier works well in text classification. However, improving the performance of the classifier is still attractive. Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging or Boosting, that significantly improve classifiers such as decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve nearest neighbor classifiers. In this…
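As a rough sketch of the combining idea described in the abstract, the snippet below builds a majority-vote ensemble of kNN classifiers, each restricted to a different feature subset. This is only an illustration under assumed toy data and hand-chosen subsets, not the paper's algorithm (which selects the subsets as rough-set reducts):

```python
from collections import Counter
import math

def knn_predict(train, labels, x, k=3, feats=None):
    """Classify x by majority vote among its k nearest training points,
    measuring Euclidean distance over the feature indices in `feats`
    (None means use all features)."""
    idx = list(feats) if feats is not None else range(len(x))
    dists = sorted(
        (math.sqrt(sum((xi[f] - x[f]) ** 2 for f in idx)), yi)
        for xi, yi in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

def ensemble_predict(train, labels, x, feature_subsets, k=3):
    """Combine one kNN base classifier per feature subset by majority vote."""
    votes = Counter(knn_predict(train, labels, x, k, fs) for fs in feature_subsets)
    return votes.most_common(1)[0][0]

# Toy 2-D data (hypothetical): class 0 near the origin, class 1 near (5, 5).
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = [0, 0, 0, 1, 1, 1]
subsets = [(0,), (1,), (0, 1)]  # each base classifier sees one subset

print(ensemble_predict(train, labels, (0.5, 0.5), subsets))  # -> 0
print(ensemble_predict(train, labels, (5.5, 5.5), subsets))  # -> 1
```

Because each base classifier votes on a different view of the features, the ensemble can outperform a single kNN classifier when individual subsets are noisy but their errors are uncorrelated.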
This paper has 33 citations.

References

Publications referenced by this paper.

An Evaluation of Statistical Approaches to Text Classification

  • Y. Yang
  • Journal of Information Retrieval
  • 1999

Combining Multiple k-Nearest Neighbor Classifiers Using Feature Combinations

  • Itqon, S. Kaneko, S. Igarashi
  • 1999

Combining Nearest Neighbor Classifiers…

  • S. D. Bay

The Discernibility Matrices and Functions in Information Systems

  • A. Skowron, C. Rauszer
  • in R. Slowinski (ed.), Intelligent Decision Support - Handbook of Application and Advances of Rough Sets
  • 1992

Text Classification with Support Vector Machines: Learning with Many Relevant Features

  • T. Joachims
