An analysis of boosted ensembles of binary fuzzy decision trees

@article{Barsacchi2020AnAO,
  title={An analysis of boosted ensembles of binary fuzzy decision trees},
  author={Marco Barsacchi and Alessio Bechini and Francesco Marcelloni},
  journal={Expert Syst. Appl.},
  year={2020},
  volume={154},
  pages={113436}
}
Abstract: Classification is a functionality that plays a central role in the development of modern expert systems across a wide variety of application fields: using accurate, efficient, and compact classification models is often a prime requirement. Boosting (and AdaBoost in particular) is a well-known technique for obtaining robust classifiers from properly learned weak classifiers, making it particularly attractive in many practical settings. Although the use of traditional classifiers as base…
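The abstract's core mechanism, AdaBoost's iterative reweighting of training samples, can be illustrated with a minimal sketch. Everything below is an illustrative assumption: the toy data, the exhaustive decision-stump weak learner, and the helper names are made up for demonstration; the paper itself boosts binary *fuzzy* decision trees, not plain stumps.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)        # toy labels in {-1, +1}

def fit_stump(X, y, w):
    """Exhaustively fit a one-split decision stump under sample weights w."""
    best = (0, 0.0, 1, np.inf)                     # (feature, threshold, polarity, weighted error)
    for j in range(X.shape[1]):
        for t in X[:, j]:
            for pol in (1, -1):
                pred = np.where(X[:, j] <= t, pol, -pol)
                err = np.sum(w * (pred != y))
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def stump_predict(stump, X):
    j, t, pol, _ = stump
    return np.where(X[:, j] <= t, pol, -pol)

w = np.full(len(y), 1.0 / len(y))                  # start from uniform weights
ensemble = []
for _ in range(10):                                # 10 boosting rounds
    stump = fit_stump(X, y, w)
    pred = stump_predict(stump, X)
    err = max(stump[3], 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)          # weight of this weak classifier
    w *= np.exp(-alpha * y * pred)                 # up-weight the misclassified points
    w /= w.sum()
    ensemble.append((alpha, stump))

# Final prediction: sign of the weighted-majority vote
F = sum(a * stump_predict(s, X) for a, s in ensemble)
train_acc = np.mean(np.sign(F) == y)
```

Note that searching both polarities guarantees each stump's weighted error is at most 1/2, so every `alpha` is non-negative, which is exactly the "weak learner" condition binary AdaBoost relies on.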
Implicitly distributed fuzzy random forests
TLDR
DFRF, a novel distributed fuzzy random forest induction algorithm based on a fuzzy discretizer for continuous attributes, is proposed; although shaped on the MapReduce programming model, it takes advantage of the implicit distribution of computation provided by the Apache Spark framework.
Gradient and Newton Boosting for Classification and Regression
  • Fabio Sigrist
  • Computer Science, Mathematics
    Expert Syst. Appl.
  • 2021
TLDR
The experiments show that Newton boosting outperforms gradient and hybrid gradient-Newton boosting in terms of predictive accuracy on the majority of datasets, and empirical evidence is presented that this difference in predictive accuracy is not primarily due to faster convergence of Newton boosting, but rather because Newton boosting often achieves lower test errors while at the same time having lower training losses.
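The distinction this paper studies, first-order (gradient) versus second-order (Newton) boosting updates, can be shown for the logistic loss on a single terminal region. This is a hedged sketch of the generic update rules under made-up toy labels, not the paper's implementation.

```python
import numpy as np

# Logistic loss L(y, F) = log(1 + exp(-y F)), with labels y in {-1, +1}.
y = np.array([1.0, 1.0, -1.0, 1.0, -1.0])   # toy labels (illustrative)
F = np.zeros_like(y)                        # current ensemble scores

g = -y / (1 + np.exp(y * F))                # first derivative  dL/dF
h = np.abs(g) * (1 - np.abs(g))             # second derivative d2L/dF2

# For one terminal region containing all samples:
gradient_step = -g.mean()                   # gradient boosting: fit the negative gradient
newton_step = -g.sum() / h.sum()            # Newton boosting: one Newton-Raphson step
```

At `F = 0` the two rules already disagree (`gradient_step = 0.1`, `newton_step = 0.4` here): Newton's step rescales the negative gradient by the curvature, which is the source of the behavioral differences the paper measures.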
Fuzzy Hoeffding Decision Tree for Data Stream Classification
TLDR
FHDT is introduced, a fuzzy HDT that extends HDT with fuzziness, thus making HDT more robust to noisy and vague data, and it is shown that FHDT outperforms HDT, especially in presence of concept drift.
Decision Trees-based Anomaly Detection in Computer Assessment Results
A survey of current software for computer-based assessment of university students is presented. This work shows ways to improve the quality of computer assessments and of educational management.
Silas: A high-performance machine learning foundation for logical reasoning and verification
TLDR
The potential capabilities of the fusion of machine learning and logical reasoning are illustrated by showcasing applications in three directions: formal verification of the prediction model against user specifications, training correct-by-construction models, and explaining the decision-making of predictions.
Decision Tree and AHP Methods Application for Projects Assessment: A Case Study
TLDR
A new model is addressed that combines decision-making process modelling with the AHP method and includes an analysis of model stability with respect to stakeholders' behaviour, showing that small fluctuations in the project factors can sometimes affect project selection, indicating a possible lack of robustness in the project decisions.
Improving Matching Process with Expanding and Classifying Criterial Keywords leveraging Word Embedding and Hierarchical Clustering Methods
TLDR
This study proposes solutions by extracting criterial keywords from social networking services (SNSs) based on word embedding and by classifying the obtained keywords via hierarchical clustering, enabling enterprises to gather and prioritize criterial keywords more accurately and thus improve their matching processes.

References

Showing 1-10 of 63 references
Multi-class boosting with fuzzy decision trees
TLDR
FDT-Boost is proposed, a boosting approach shaped according to the multi-class SAMME-AdaBoost scheme that employs size-constrained fuzzy binary decision trees as weak classifiers; it is accurate, yet keeps model complexity low in terms of the total number of leaf nodes.
An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization
TLDR
The experiments show that in situations with little or no classification noise, randomization is competitive with (and perhaps slightly superior to) bagging but not as accurate as boosting, while in situations with substantial classification noise, bagging is much better than boosting, and sometimes better than randomization.
Example-dependent cost-sensitive adaptive boosting
TLDR
The ECS generalization of AdaBoost is presented, providing new insight regarding the behavior of the cost-sensitive model from a theoretical point of view and proving that the presented approach can significantly improve the practical design of intelligent systems.
Why Fuzzy Decision Trees are Good Rankers
TLDR
It is shown that a fuzzy extension of decision trees is arguably more useful for another performance task, namely ranking, and that certain properties crucial for good ranking performance are better and more naturally offered by fuzzy than by conventional decision trees.
Fuzzy classification trees for data analysis
TLDR
This paper presents the basic definition of fuzzy classification trees, a new model that integrates fuzzy classifiers with decision trees, along with their construction algorithm; the model can work well in classifying noisy data.
A fast and efficient multi-objective evolutionary learning scheme for fuzzy rule-based classifiers
TLDR
An MOEA-based approach is proposed to concurrently learn the rule and data bases of fuzzy rule-based classifiers (FRBCs) by exploiting a rule and condition selection (RCS) strategy, which selects a reduced number of rules from a heuristically generated set of candidate rules and reduces the conditions of each selected rule during the evolutionary process.
Multi-class AdaBoost
TLDR
A new algorithm is proposed that naturally extends the original AdaBoost algorithm to the multiclass case without reducing it to multiple two-class problems; it is extremely easy to implement and is highly competitive with the best currently available multi-class classification methods.
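SAMME's key change is its classifier-weight formula, which adds a log(K-1) term so that a weak learner only needs to beat random guessing among K classes (accuracy above 1/K) rather than 50% accuracy as in binary AdaBoost. A small sketch of that formula (the function name and numeric values are illustrative):

```python
import numpy as np

def samme_alpha(err, K):
    """SAMME classifier weight: log((1-err)/err) + log(K-1)."""
    return np.log((1 - err) / err) + np.log(K - 1)

# With K = 2 the extra term vanishes, recovering binary AdaBoost's weighting
# (up to the conventional 1/2 factor):
a2 = samme_alpha(0.3, 2)    # ≈ 0.847, positive since err < 1/2
# For K = 10, even err = 0.8 (still better than the 90% random-guess error)
# yields a positive weight, so the weak learner is still usable:
a10 = samme_alpha(0.8, 10)  # ≈ 0.811
```

This weakened requirement on the base learner is what lets the FDT-Boost approach above use small, size-constrained trees directly in the multi-class setting.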
A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting
TLDR
The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and it is shown that the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.
A study of the behaviour of linguistic fuzzy rule based classification systems in the framework of imbalanced data-sets
TLDR
The necessity of applying a preprocessing step to deal with the problem of imbalanced data-sets is analyzed, along with the granularity of the fuzzy partitions, the use of distinct conjunction operators, the application of some approaches to compute the rule weights, and the use of different fuzzy reasoning methods.
A multi-class boosting method with direct optimization
TLDR
As a non-convex optimization method, DMCBoost shows results competitive with, or better than, state-of-the-art convex relaxation boosting methods, and it performs especially well in noisy cases.