
Random subspace method

Known as: Attribute bagging 
In machine learning, the random subspace method, also called attribute bagging or feature bagging, is an ensemble learning method that attempts to…
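To make the idea concrete, here is a minimal toy sketch of the random subspace method, assuming a nearest-centroid base classifier (a choice made for illustration only; the method itself is agnostic to the base learner). Each base model is trained on a randomly drawn subset of the feature columns, and predictions are combined by majority vote. All names here are hypothetical.

```python
import random
from collections import Counter

def train_subspace_ensemble(X, y, n_estimators=11, subspace_size=2, seed=0):
    """Random subspace method: each base learner is fit on a random
    subset of feature columns. Toy nearest-centroid base classifier."""
    rng = random.Random(seed)
    n_features = len(X[0])
    counts = Counter(y)
    ensemble = []
    for _ in range(n_estimators):
        # Sample a feature subset without replacement.
        feats = rng.sample(range(n_features), subspace_size)
        # Fit a nearest-centroid model restricted to those features.
        centroids = {}
        for xi, yi in zip(X, y):
            proj = [xi[f] for f in feats]
            if yi not in centroids:
                centroids[yi] = [0.0] * subspace_size
            for j, v in enumerate(proj):
                centroids[yi][j] += v / counts[yi]
        ensemble.append((feats, centroids))
    return ensemble

def predict(ensemble, x):
    """Majority vote over the subspace models (use an odd
    n_estimators to avoid ties in the two-class case)."""
    votes = Counter()
    for feats, centroids in ensemble:
        proj = [x[f] for f in feats]
        label = min(
            centroids,
            key=lambda c: sum((p - q) ** 2
                              for p, q in zip(proj, centroids[c])),
        )
        votes[label] += 1
    return votes.most_common(1)[0][0]
```

Because each learner sees a different projection of the data, the ensemble members are decorrelated even when trained on the same instances, which is the source of the method's variance reduction.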
Wikipedia

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2017
We consider semi-supervised learning, the task of learning from both labeled and unlabeled instances, and in particular self-training…
Highly Cited
2012
2011
Learning in non-stationary environments is an increasingly important problem in a wide variety of real-world applications. In non… 
Highly Cited
2010
  • S. Kotsiantis
  • Artificial Intelligence Review
  • 2010
  • Corpus ID: 19682573
Bagging, boosting, rotation forest and random subspace methods are well known re-sampling ensemble methods that generate and… 
2008
Semi-supervised learning has received much attention recently. Co-training is a kind of semi-supervised learning method which… 
Highly Cited
2006
Highly Cited
2002
Abstract: Recently, bagging, boosting, and the random subspace method have become popular combining techniques for improving weak…
Highly Cited
1998
  • T. Ho
  • IEEE Transactions on Pattern Analysis and Machine…
  • 1998
  • Corpus ID: 206420153
Much of previous attention on decision trees focuses on the splitting criteria and optimization of tree sizes. The dilemma… 
Highly Cited
1998
  • T. Ho
  • SSPR/SPR
  • 1998
  • Corpus ID: 39796372
Recent studies have shown that the random subspace method can be used to create multiple independent tree-classifiers that can be…
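The tree-ensemble use of the random subspace method described above can be sketched with scikit-learn, assuming that library is available: its `BaggingClassifier` performs feature subsampling when `max_features` is below 1.0, and setting `bootstrap=False` disables instance resampling so that only the feature subspaces vary across trees. The dataset and parameter values here are illustrative choices, not anything prescribed by the papers listed.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Random subspace ensemble of decision trees: each tree is trained on
# all instances (bootstrap=False) but only half of the feature columns,
# sampled without replacement (bootstrap_features defaults to False).
clf = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=10,
    max_features=0.5,
    bootstrap=False,
    random_state=0,
)
clf.fit(X, y)
```

Each fitted tree stores the column indices it saw in `clf.estimators_features_`, so the independent per-tree subspaces the snippet above refers to can be inspected directly.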