Corpus ID: 88523511

Some variations on Ensembled Random Survival Forest with application to Cancer Research

@article{Dey2017SomeVO,
  title={Some variations on Ensembled Random Survival Forest with application to Cancer Research},
  author={Arabin Kumar Dey and N. Suhas and Talasila Sai Teja and Anshul Juneja},
  journal={arXiv: Methodology},
  year={2017}
}
In this paper we describe a novel implementation of AdaBoost for prediction of the survival function. We consider different variations of the algorithm and compare them based on system run time and root mean square error. Our construction also accommodates right-censored data and competing-risks data. We use different data sets to illustrate the performance of the algorithms.
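To fix ideas, here is a minimal, self-contained sketch of discrete AdaBoost with one-dimensional threshold stumps. This is generic classification boosting, not the paper's survival-function variant; the function names (`train_adaboost`, `predict`) and the stump base learner are illustrative assumptions.

```python
import math

def train_adaboost(X, y, rounds=10):
    """Discrete AdaBoost with 1-D threshold stumps (illustrative sketch).

    X : list of floats, y : list of +1/-1 labels.
    Returns a list of weak learners (threshold, polarity, alpha).
    """
    n = len(X)
    w = [1.0 / n] * n                       # uniform sample weights
    model = []
    for _ in range(rounds):
        best = None
        # exhaustively search thresholds and both polarities for the
        # stump with minimum weighted error
        for thr in sorted(set(X)):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi >= thr else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = max(err, 1e-10)               # avoid log(0) on perfect stumps
        alpha = 0.5 * math.log((1 - err) / err)   # learner weight
        model.append((thr, pol, alpha))
        # reweight: upweight misclassified samples, then renormalize
        preds = [(pol if xi >= thr else -pol) for xi in X]
        w = [wi * math.exp(-alpha * yi * pi) for wi, yi, pi in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return model

def predict(model, x):
    """Sign of the alpha-weighted vote over all weak learners."""
    score = sum(alpha * (pol if x >= thr else -pol) for thr, pol, alpha in model)
    return 1 if score >= 0 else -1
```

The same boost-and-reweight loop underlies the survival variants compared in the paper, with the stump replaced by a survival base learner.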

References

Showing 1–10 of 18 references

Random survival forests for competing risks.

A new approach to competing risks using random forests is introduced and it is shown that the method is highly effective for both prediction and variable selection in high-dimensional problems and in settings such as HIV/AIDS that involve many competing risks.

AdaBoost algorithm with random forests for predicting breast cancer survivability

Experimental results indicate that the proposed method outperforms a single classifier and other combined classifiers for the breast cancer survivability prediction.

Random Survival Forests

This article introduces random survival forests, a random forests method for the analysis of right-censored survival data, and extends Breiman’s random forests (RF) method, showing it to be highly accurate and comparable to state-of-the-art methods.

Regression Trees for Censored Data

The regression-tree methodology is extended to right-censored response variables by replacing the conventional splitting rules with rules based on the Tarone–Ware or Harrington–Fleming classes of two-sample statistics.

Strong Time Dependence of the 76-Gene Prognostic Signature for Node-Negative Breast Cancer Patients in the TRANSBIG Multicenter Independent Validation Series

This independent validation confirmed the performance of the 76-gene signature and adds to the growing evidence that gene expression signatures are of clinical relevance, especially for identifying patients at high risk of early distant metastases.

Extremely randomized trees

A new tree-based ensemble method for supervised classification and regression problems that strongly randomizes both attribute and cut-point choice when splitting a tree node; in the extreme, it builds totally randomized trees whose structures are independent of the output values of the learning sample.
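The key randomization described above — drawing the cut-point at random rather than optimizing it — can be sketched for one node on 1-D data. The helper name `extra_split` is a hypothetical illustration, not from the cited paper.

```python
import random
import statistics

def extra_split(points, rng):
    """One Extra-Trees-style split on 1-D (x, y) data.

    The cut-point is drawn uniformly between the sample's min and max
    of x, instead of being optimized against the targets, which is the
    method's defining randomization. Returns the cut and the mean
    target on each side (None for an empty side).
    """
    xs = [x for x, _ in points]
    cut = rng.uniform(min(xs), max(xs))
    left = [y for x, y in points if x < cut]
    right = [y for x, y in points if x >= cut]
    return (cut,
            statistics.mean(left) if left else None,
            statistics.mean(right) if right else None)
```

A full Extra-Trees ensemble repeats this at every node of many trees and averages their predictions.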

Nelson–Aalen Estimator

It is described how the Nelson–Aalen estimator can be used to provide a nonparametric estimate of the cumulative hazard rate function based on right-censored and/or left-truncated survival data.
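The estimator itself is a short sum, H(t) = Σ_{t_i ≤ t} d_i / n_i, where d_i is the number of events at time t_i and n_i the number at risk just before t_i. A minimal sketch for right-censored data (the function name `nelson_aalen` is ours):

```python
from collections import Counter

def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard H(t).

    times  : observed times (event or censoring)
    events : 1 if the event occurred at that time, 0 if right-censored
    Returns a list of (t, H(t)) pairs at each distinct event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    hazard, out = 0.0, []
    for t in sorted(deaths):
        # number at risk just before t: subjects with observed time >= t
        at_risk = sum(1 for u in times if u >= t)
        hazard += deaths[t] / at_risk      # increment d_i / n_i
        out.append((t, hazard))
    return out
```

For example, with times [1, 2, 2, 3, 4] and event indicators [1, 1, 0, 1, 0], the estimate steps through 1/5, then 1/4, then 1/2, reaching H(3) = 0.95.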

‘Modest AdaBoost’ – Teaching AdaBoost to Generalize Better

A new boosting algorithm is proposed, which produces less generalization error than the previously mentioned algorithms at the cost of somewhat higher training error.

Bagging predictors

Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy.
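Bagging reduces to two steps: fit a base learner on each bootstrap resample, then average the predictions. A minimal sketch using 1-nearest-neighbour regression as a stand-in for the trees used in the cited work (the function name `bagging_predict` is illustrative):

```python
import random
import statistics

def bagging_predict(train, x, n_models=25, seed=0):
    """Bagged 1-NN regression on 1-D (x, y) training pairs.

    Each bootstrap resample (drawn with replacement) yields one base
    predictor; the final prediction averages all of them.
    """
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        boot = [train[rng.randrange(len(train))] for _ in train]
        # base learner: 1-nearest neighbour on this bootstrap sample
        nearest = min(boot, key=lambda p: abs(p[0] - x))
        preds.append(nearest[1])
    return statistics.mean(preds)
```

Averaging over resamples stabilizes high-variance base learners, which is why the gains reported in the reference are largest for unstable procedures such as regression trees.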

A novel local success weighted ensemble classifier

A novel framework is proposed for data-dependent weighting of base classifiers instead of fixed weights, linking each weight to the ability of the respective classifier to correctly predict the labels, in order to curb the redundant false decisions responsible for misclassification.