Machine Collaboration

@article{Liu2021MachineC,
  title={Machine Collaboration},
  author={Q. Liu and Yang Feng},
  journal={ArXiv},
  year={2021},
  volume={abs/2105.02569}
}
We propose a new ensemble framework for supervised learning, called machine collaboration (MaC), using a collection of base machines for prediction tasks. Unlike bagging/stacking (a parallel & independent framework) and boosting (a sequential & top-down framework), MaC is a type of circular & interactive learning framework. The circular & interactive feature helps the base machines to transfer information circularly and update their structures and parameters accordingly. The theoretical result…
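The abstract gives no pseudocode, so what follows is only a minimal sketch, assuming a regression setting and a backfitting-style residual exchange, of what a circular and interactive fitting loop could look like: each base machine is refit in turn on the partial residual the other machines currently leave unexplained, and the circle repeats for several rounds. The function names, the residual-passing rule, and the additive combination are illustrative assumptions, not the MaC algorithm itself.

    import numpy as np
    from sklearn.base import clone
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor

    def circular_fit(base_machines, X, y, n_rounds=5):
        """Refit each machine, in turn, on the partial residual left by the others."""
        machines = [clone(m) for m in base_machines]
        preds = [np.zeros(len(y)) for _ in machines]
        for _ in range(n_rounds):
            for i in range(len(machines)):
                # Information travels around the circle: machine i is refit
                # on what the other machines, in their current state, miss.
                partial_residual = y - (sum(preds) - preds[i])
                machines[i] = clone(base_machines[i]).fit(X, partial_residual)
                preds[i] = machines[i].predict(X)
        return machines

    def circular_predict(machines, X):
        # The ensemble prediction is the sum of the machines' contributions.
        return sum(m.predict(X) for m in machines)

    # Example usage on synthetic data: a shallow tree and a linear model
    # collaborate on a signal with both a nonlinear and a linear component.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)
    machines = circular_fit([DecisionTreeRegressor(max_depth=3), LinearRegression()], X, y)
    y_hat = circular_predict(machines, X)

Read under these assumptions, the circulation differs from both baselines the abstract names: unlike bagging, the machines do not fit independently, and unlike boosting, earlier machines are revisited and updated after later ones have fit.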

References

Showing 1-10 of 19 references

A survey on ensemble learning
Challenges and possible research directions for each mainstream approach of ensemble learning are presented, and an extra introduction is given to the combination of ensemble learning with other machine learning hot spots such as deep learning and reinforcement learning.

PMLB: a large benchmark suite for machine learning evaluation and comparison
It is found that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered.

Super learner.
A fast algorithm for constructing a super learner in prediction, which uses V-fold cross-validation to select weights to combine an initial set of candidate learners (a minimal sketch of this weighting scheme follows the reference list).

Bagging predictors
Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy.

Boosting the margin: A new explanation for the effectiveness of voting methods
It is shown that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error.

Greedy function approximation: A gradient boosting machine.
Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space, and a connection is made between stagewise additive expansions and steepest-descent minimization.

Support-Vector Networks
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Semi-Supervised Ensemble Clustering Based on Selected Constraint Projection
This work proposes double weighting semi-supervised ensemble clustering based on selected constraint projection (DCECP), which applies constraint weighting and ensemble member weighting to address limitations of traditional cluster ensemble approaches.

Ensemble approaches for regression: A survey
Approaches to each phase of the ensemble learning process that are able to deal with the regression problem are discussed, categorized in terms of their relevant characteristics and linked to contributions from different fields.

Ensemble learning: A survey
The concept of ensemble learning is introduced; traditional, novel, and state-of-the-art ensemble methods are reviewed; and current challenges and trends in the field are discussed.

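The super learner entry above describes choosing combination weights by V-fold cross-validation, the stacking-style baseline that MaC is contrasted with in the abstract. The sketch below is a rough illustration under stated assumptions: a regression setting, out-of-fold predictions stacked into a matrix, and non-negative least squares as the weight fit. The function names and the NNLS choice are illustrative, not the exact algorithm from the cited paper.

    import numpy as np
    from scipy.optimize import nnls
    from sklearn.base import clone
    from sklearn.model_selection import KFold

    def super_learner_weights(candidates, X, y, v=5):
        """Pick weights for combining candidates from out-of-fold predictions."""
        oof = np.zeros((len(y), len(candidates)))
        for train, test in KFold(n_splits=v, shuffle=True, random_state=0).split(X):
            for j, m in enumerate(candidates):
                oof[test, j] = clone(m).fit(X[train], y[train]).predict(X[test])
        # Non-negative least squares keeps the combination a weighted
        # average once the weights are normalized to sum to one.
        w, _ = nnls(oof, y)
        return w / w.sum()

    def super_learner_predict(fitted, w, X):
        # Weighted combination of the candidates refit on the full data.
        return np.column_stack([m.predict(X) for m in fitted]) @ w

In this reading, the candidate learners would be refit on the full data after the weights are chosen; only the weights come from the cross-validation step.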