Bagging KNN Classifiers using Different Expert Fusion Strategies


An experimental evaluation of bagging K-nearest neighbor (KNN) classifiers is performed. The goal is to investigate whether other soft aggregation methods yield better results than Sum and Vote. We evaluate the performance of the Sum, Product, MProduct, Minimum, Maximum, Median, and Vote fusion rules under varying parameters. The results over different training…
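The fusion rules named in the abstract can be sketched as follows: each bagged KNN is trained on a bootstrap sample, and the ensemble's class-probability estimates are combined by Sum, Product, Minimum, Maximum, Median, or majority Vote. This is a minimal illustrative sketch, not the paper's implementation; the function name, parameter defaults, and dataset are assumptions, and the MProduct rule is omitted since its exact definition is not given in the excerpt.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def bagged_knn_predict(X_train, y_train, X_test,
                       n_estimators=10, k=5, rule="sum", seed=0):
    """Bag KNN classifiers on bootstrap samples and fuse their
    class-probability estimates with a soft combination rule.
    (Hypothetical helper; names and defaults are illustrative.)"""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    probas = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)  # bootstrap sample with replacement
        clf = KNeighborsClassifier(n_neighbors=k)
        clf.fit(X_train[idx], y_train[idx])
        probas.append(clf.predict_proba(X_test))
    P = np.stack(probas)  # shape: (n_estimators, n_test, n_classes)
    if rule == "sum":
        fused = P.sum(axis=0)
    elif rule == "product":
        fused = P.prod(axis=0)
    elif rule == "min":
        fused = P.min(axis=0)
    elif rule == "max":
        fused = P.max(axis=0)
    elif rule == "median":
        fused = np.median(P, axis=0)
    elif rule == "vote":
        # Majority vote over each expert's hard decision
        votes = P.argmax(axis=2)              # (n_estimators, n_test)
        n_classes = P.shape[2]
        fused = np.apply_along_axis(
            lambda v: np.bincount(v, minlength=n_classes), 0, votes).T
    else:
        raise ValueError(f"unknown fusion rule: {rule}")
    return fused.argmax(axis=1)
```

Note that the Product and Minimum rules are sensitive to any single expert assigning a class zero probability, which is one reason such comparisons are run over varying parameters.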

3 Figures and Tables
