
Random subspace method

Known as: Attribute bagging 
In machine learning, the random subspace method, also called attribute bagging or feature bagging, is an ensemble learning method that attempts to…
Wikipedia
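As a rough illustration of the idea, the sketch below trains each base learner on a randomly drawn subset of the features and combines their predictions by majority vote. It is a minimal sketch, not code from any paper on this page; the dataset, the subspace size, and the voting combiner are all assumptions made for the example.

```python
# Minimal random subspace (attribute bagging) sketch.
# Assumptions: breast-cancer toy dataset, 25 trees, half the features per tree.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_estimators, subspace_size = 25, X.shape[1] // 2
ensemble = []
for _ in range(n_estimators):
    # Each base learner sees a random subset of the features,
    # not a bootstrap sample of the training instances.
    feats = rng.choice(X.shape[1], size=subspace_size, replace=False)
    tree = DecisionTreeClassifier(random_state=0).fit(X_tr[:, feats], y_tr)
    ensemble.append((feats, tree))

# Majority vote over the per-subspace predictions.
votes = np.stack([tree.predict(X_te[:, feats]) for feats, tree in ensemble])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("accuracy:", (y_pred == y_te).mean())
```

Because every tree works in a different feature subspace, their errors tend to be less correlated than those of trees trained on the full feature set, which is what the ensemble vote exploits.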

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2013
In this paper, we show that the way internal estimates are used to measure variable importance in Random Forests is also…
Highly Cited
2013
Abstract: As anticipated in True Names by Vernor Vinge, identity has been recognized as our most valued possession in cyberspace…
2011
Learning in non-stationary environments is an increasingly important problem in a wide variety of real-world applications. In non…
Highly Cited
2010
S. Kotsiantis · Artificial Intelligence Review · 2010 · Corpus ID: 19682573
Bagging, boosting, rotation forest and random subspace methods are well known re-sampling ensemble methods that generate and…
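The survey entry above groups bagging, boosting, rotation forest and the random subspace method as re-sampling ensembles. As a hedged illustration of how the first and last differ, the sketch below uses scikit-learn's BaggingClassifier purely for convenience; the survey itself does not prescribe this setup, and the sampling fractions are arbitrary choices for the example.

```python
# Illustration: bagging resamples training instances (rows),
# while the random subspace method resamples features (columns).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

bagging = BaggingClassifier(          # classic bagging: bootstrap the rows
    n_estimators=50, max_samples=0.7, bootstrap=True,
    max_features=1.0, bootstrap_features=False, random_state=0)

random_subspace = BaggingClassifier(  # random subspace: subsample the columns
    n_estimators=50, max_samples=1.0, bootstrap=False,
    max_features=0.5, bootstrap_features=False, random_state=0)

for name, model in [("bagging", bagging), ("random subspace", random_subspace)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```

The only difference between the two configurations is which axis of the training matrix is randomized for each base learner.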
Highly Cited
2009
The small sample size (SSS) and the sensitivity to variations such as illumination, expression and occlusion are two challenging…
2008
Semi-supervised learning has received much attention recently. Co-training is a kind of semi-supervised learning method which…
Highly Cited
2006
In a growing number of domains, the data captured encapsulates as many features as possible. This poses a challenge to classical…
Highly Cited
2002
Abstract: Recently, bagging, boosting and the random subspace method have become popular combining techniques for improving weak…
Highly Cited
1998
T. Ho · IEEE Trans. Pattern Anal. Mach. Intell. · 1998 · Corpus ID: 206420153
Much of previous attention on decision trees focuses on the splitting criteria and optimization of tree sizes. The dilemma…
Highly Cited
1998
T. Ho · SSPR/SPR · 1998 · Corpus ID: 39796372
Recent studies have shown that the random subspace method can be used to create multiple independent tree-classifiers that can be…