Publications
An Optimized Cost-Sensitive SVM for Imbalanced Data Learning
TLDR
An effective wrapper framework is presented that incorporates the evaluation measures (AUC and G-mean) directly into the objective function of a cost-sensitive SVM, improving classification performance by simultaneously optimizing the feature subset, intrinsic parameters, and misclassification cost parameters.
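For intuition, a minimal sketch of the wrapper idea under simplifying assumptions: the grid of cost ratios and RBF widths, the G-mean objective, and scikit-learn's SVC are illustrative stand-ins, not the paper's exact optimizer (which also searches feature subsets).

```python
# Sketch only: grid-search a cost-sensitive SVM for the best G-mean.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import recall_score

def g_mean(y_true, y_pred):
    # geometric mean of sensitivity and specificity (binary labels 0/1 assumed)
    sens = recall_score(y_true, y_pred, pos_label=1)
    spec = recall_score(y_true, y_pred, pos_label=0)
    return np.sqrt(sens * spec)

def tune_cost_sensitive_svm(X, y, cost_ratios=(1, 2, 5, 10), gammas=(0.01, 0.1, 1.0)):
    best_params, best_score = None, -1.0
    for r in cost_ratios:       # extra misclassification cost on the minority class
        for g in gammas:        # RBF kernel width (an "intrinsic" parameter)
            clf = SVC(kernel="rbf", gamma=g, class_weight={0: 1.0, 1: float(r)})
            pred = cross_val_predict(clf, X, y, cv=5)
            score = g_mean(y, pred)
            if score > best_score:
                best_params, best_score = {"cost_ratio": r, "gamma": g}, score
    return best_params, best_score
```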
L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise
TLDR
A novel information-theoretic loss function, L_DMI, is proposed; it is the first loss function provably robust to instance-independent label noise regardless of noise pattern, and it can be applied directly to any existing classification neural network without auxiliary information.
L_DMI: An Information-theoretic Noise-robust Loss Function
TLDR
A novel information-theoretic loss function, Determinant-based Mutual Information (DMI), is proposed for training deep neural networks robust to label noise; empirically, it outperforms all competing losses on classification tasks over both image and natural-language datasets.
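A minimal sketch of a determinant-based mutual information loss of this kind, assuming a PyTorch setting; the function name, tensor shapes, and the small constant inside the log are illustrative choices, not the authors' released implementation.

```python
# Sketch only: -log|det(U)|, where U is the empirical joint distribution matrix
# between the classifier's soft predictions and the observed (noisy) labels.
import torch
import torch.nn.functional as F

def dmi_loss(logits, noisy_labels, num_classes):
    """logits: (N, C) network outputs; noisy_labels: (N,) integer class indices."""
    probs = F.softmax(logits, dim=1)                        # (N, C) predicted distribution
    onehot = F.one_hot(noisy_labels, num_classes).float()   # (N, C) observed labels
    joint = onehot.t() @ probs / logits.size(0)             # (C, C) empirical joint matrix
    # with instance-independent noise the determinant factorizes, so the noise
    # only shifts the loss by a constant and the classifier ranking is preserved
    return -torch.log(torch.abs(torch.det(joint)) + 1e-3)
```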
Pulmonary Nodule Classification with Deep Convolutional Neural Networks on Computed Tomography Images
TLDR
A deep convolutional neural network method is designed for nodule classification; it has the advantages of automatically learned representations and strong generalization ability, and it consistently outperforms competing methods.
A Spatio-temporal Transformer for 3D Human Motion Prediction
TLDR
The dual attention concept allows the model to access current and past information directly and to capture both structural and temporal dependencies explicitly; empirically, this effectively learns the underlying motion dynamics and reduces the error accumulation over time observed in auto-regressive models.
Ensemble based adaptive over-sampling method for imbalanced data learning in computer aided detection of microaneurysm
TLDR
An ensemble-based adaptive over-sampling algorithm is proposed for overcoming the class imbalance problem in false positive reduction, with boosting, bagging, and random subspace as the ensemble frameworks to improve microaneurysm detection.
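A minimal sketch of the general pattern (over-sampling the minority class inside each member of a bagging ensemble); the simple random over-sampling, decision-tree base learner, and ensemble size stand in for the paper's adaptive sampling and its boosting/bagging/random-subspace variants.

```python
# Sketch only: bagging where each bootstrap sample is re-balanced by
# randomly over-sampling the minority class (label 1) before fitting.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def oversampled_bagging(X, y, n_members=10, random_state=0):
    rng = np.random.RandomState(random_state)
    members = []
    for _ in range(n_members):
        idx = rng.randint(0, len(y), size=len(y))      # bootstrap sample
        Xb, yb = X[idx], y[idx]
        minority = np.flatnonzero(yb == 1)
        majority = np.flatnonzero(yb == 0)
        if 0 < len(minority) < len(majority):
            extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
            keep = np.concatenate([majority, minority, extra])
        else:
            keep = np.arange(len(yb))
        members.append(DecisionTreeClassifier(random_state=random_state).fit(Xb[keep], yb[keep]))
    return members

def ensemble_predict(members, X):
    # majority vote over the ensemble members
    votes = np.mean([m.predict(X) for m in members], axis=0)
    return (votes >= 0.5).astype(int)
```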
Diabetic macular edema grading in retinal images using vector quantization and semi-supervised learning
TLDR
The proposed system overcomes the challenges of DME grading and demonstrates promising effectiveness; its performance is compared against state-of-the-art approaches.
Modeling Alzheimer's disease cognitive scores using multi-task sparse group lasso
TLDR
A multi-task sparse group lasso (MT-SGL) framework is presented that estimates sparse features coupled across tasks and can work with loss functions associated with any generalized linear model.
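For illustration, a sketch of the proximal step behind a sparse group lasso penalty in which each group ties one feature's weights across tasks; the grouping, step size, and regularization weights here are generic assumptions rather than the MT-SGL specifics.

```python
# Sketch only: proximal operator of  lam1*||W||_1 + lam2*sum_g ||W_g||_F,
# evaluated with step size `step` (soft-threshold, then group shrinkage).
import numpy as np

def prox_sparse_group_lasso(W, groups, lam1, lam2, step):
    """W: (n_features, n_tasks) weights; groups: list of feature-index arrays."""
    # elementwise soft-thresholding (the lasso part: sparsity within features)
    W = np.sign(W) * np.maximum(np.abs(W) - step * lam1, 0.0)
    # group-wise shrinkage (the group lasso part: features selected jointly across tasks)
    for g in groups:
        norm = np.linalg.norm(W[g])
        if norm > 0.0:
            W[g] *= max(0.0, 1.0 - step * lam2 / norm)
    return W
```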
Ensemble-based hybrid probabilistic sampling for imbalanced data learning in lung nodule CAD
TLDR
Experimental results demonstrate the effectiveness of the proposed hybrid probabilistic sampling combined with a diverse random subspace ensemble, in terms of geometric mean and area under the ROC curve (AUC), compared with commonly used methods.
Generalized fused group lasso regularized multi-task feature learning for predicting cognitive outcomes in Alzheimer's disease
TLDR
Experimental results with real and synthetic data demonstrate that incorporating the two prior structures through the generalized fused group lasso norm into multi-task feature learning can improve prediction performance over several state-of-the-art competing methods.
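One common way to write such an objective is sketched below, with a squared loss per task, a group penalty over predefined feature groups, and a fused penalty over a graph of related tasks; the notation is a generic assumption, not necessarily the paper's exact formulation.

```latex
\min_{W=[w_1,\dots,w_T]}\;
  \sum_{t=1}^{T} \lVert X_t w_t - y_t \rVert_2^2
  \;+\; \lambda_1 \sum_{g \in \mathcal{G}} \lVert W_g \rVert_F
  \;+\; \lambda_2 \sum_{(s,t) \in E} \lVert w_s - w_t \rVert_1
```

Here the groups in $\mathcal{G}$ tie related features together, while the edge set $E$ links tasks whose weight vectors are encouraged to stay close.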