RMDL: Random Multimodel Deep Learning for Classification

@inproceedings{Kowsari2018RMDLRM,
  title={RMDL: Random Multimodel Deep Learning for Classification},
  author={Kamran Kowsari and Mojtaba Heidarysafa and Donald E. Brown and K. Meimandi and Laura E. Barnes},
  booktitle={ICISDM '18},
  year={2018}
}
The continually increasing number of complex datasets each year necessitates ever-improving machine learning methods for robust and accurate categorization of these data. This paper introduces Random Multimodel Deep Learning (RMDL): a new ensemble, deep learning approach for classification. Deep learning models have achieved state-of-the-art results across many domains. RMDL solves the problem of finding the best deep learning structure and architecture while simultaneously improving robustness…
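As a rough sketch of the idea (not the authors' implementation), an RMDL-style ensemble trains several independently and randomly configured models and combines their predictions by majority vote. Here the randomly configured DNN/RNN/CNN members are replaced by hypothetical classifier stubs with an assumed 20% per-model error rate, to show only the voting mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_model(n_classes):
    """Stand-in for one randomly configured ensemble member.

    In RMDL, depth, width, and other hyperparameters are drawn at random
    for each member; here we just return a noisy classifier stub.
    """
    def predict(x):
        noise = rng.integers(0, n_classes, size=len(x))
        wrong = rng.random(len(x)) < 0.2   # assumed 20% error rate per model
        return np.where(wrong, noise, x % n_classes)
    return predict

def majority_vote(all_preds):
    """Combine per-model predictions (shape: models x samples) by majority vote."""
    preds = np.asarray(all_preds)
    n_classes = preds.max() + 1
    return np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes).argmax(), 0, preds)

x = np.arange(100)                          # toy inputs; true label is x % 4
models = [random_model(4) for _ in range(9)]
ensemble = majority_vote([m(x) for m in models])
accuracy = (ensemble == x % 4).mean()       # vote suppresses individual errors
```

Even with each stub wrong 20% of the time, the nine-member vote recovers nearly all labels, which is the robustness argument the abstract makes.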
An Improvement of Data Classification Using Random Multimodel Deep Learning (RMDL)
Random Multimodel Deep Learning (RMDL) is introduced: a new ensemble, deep learning approach for classification that solves the problem of finding the best deep learning structure and architecture while simultaneously improving robustness and accuracy through ensembles of deep learning architectures.
System Fusion with Deep Ensembles
This paper introduces a set of deep learning techniques for ensemble learning with dense, attention, and convolutional neural network layers that outperforms the existing state-of-the-art algorithms by a large margin.
Deep learning methods for data classification
This chapter discusses various deep learning methods developed to perform data classification in data mining.
Building Deep Random Ferns Without Backpropagation
A deep random ferns (d-RFs) model is proposed, in which extremely randomized ferns are connected in multiple layers, allowing high classification performance in a lightweight and fast structure.
A Multi-blocked Image Classifier for Deep Learning
The proposed multi-blocked model is designed to use a minimum number of parameters so that it can run on a Graphics Processing Unit (GPU) with low power requirements.
A Novel Framework for Neural Architecture Search in the Hill Climbing Domain
A new framework for neural architecture search is proposed, based on a hill-climbing procedure that uses morphism operators with a novel gradient update scheme; it can search a broader search space and yields competitive results.
Suggestion Mining from Online Reviews using Random Multimodel Deep Learning
Random Multimodel Deep Learning (RMDL) is proposed, which combines three different deep learning architectures (DNNs, RNNs, and CNNs) and automatically selects the optimal hyperparameters to improve the robustness and flexibility of the model.
Performance Evaluation of Deep Learning Algorithms in Biomedical Document Classification
This paper investigates the deployment of various state-of-the-art DL-based classification models for automatic classification of benchmark biomedical datasets, using well-defined performance evaluation metrics such as accuracy, precision, recall, and F1-measure.
Metaheuristic Approach of RMDL Classification of Parkinson’s Disease
This work implements Random Multimodel Deep Learning (RMDL): a new ensemble, deep learning approach for classification that achieves better robustness and precision through ensembles of deep learning methods.
Deep learning based multi-label text classification of UNGA resolutions
A novel method is proposed that is able, through statistics like TF-IDF, to exploit pre-trained SOTA DL models (such as the Universal Sentence Encoder) without any need for traditional transfer learning or any other expensive training procedure.

References

Showing 1–10 of 61 references
Deep Learning using Linear Support Vector Machines
Y. Tang, 2013
The results show that simply replacing softmax with linear L2-SVMs gives significant gains on popular deep learning datasets: MNIST, CIFAR-10, and the ICML 2013 Representation Learning Workshop's facial expression recognition challenge.
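The substitution the summary describes amounts to training the top layer with a squared-hinge (L2-SVM) objective instead of cross-entropy. A minimal sketch of that loss, assuming one-vs-rest targets in {-1, +1} (the function name is illustrative, not from the paper's code):

```python
import numpy as np

def l2_svm_loss(scores, targets):
    """Squared hinge (L2-SVM) loss.

    scores:  real-valued outputs of the network's top linear layer
    targets: labels in {-1, +1} (one class vs. the rest)
    """
    margins = np.maximum(0.0, 1.0 - targets * scores)  # hinge margins
    return np.sum(margins ** 2)                        # squared, hence "L2"

# A confident correct score (2.0) incurs no loss; a wrong-side score does.
loss = l2_svm_loss(np.array([2.0, -0.5]), np.array([1.0, 1.0]))
```

Unlike softmax cross-entropy, the loss is exactly zero once every example clears the margin, which changes the gradients the lower layers see.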
Deep Forest: Towards An Alternative to Deep Neural Networks
In this paper, we propose gcForest, a decision tree ensemble approach with performance highly competitive to deep neural networks in a broad range of tasks. In contrast to deep neural networks, which…
HDLTex: Hierarchical Deep Learning for Text Classification
Hierarchical Deep Learning for Text classification employs stacks of deep learning architectures to provide specialized understanding at each level of the document hierarchy.
ImageNet classification with deep convolutional neural networks
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into 1000 different classes, employing a recently developed regularization method called "dropout" that proved to be very effective.
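The dropout regularizer mentioned above zeroes each activation at random during training. A minimal sketch of the commonly used "inverted" variant (an assumption here — the original paper rescales at test time instead):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout: zero each activation with probability p during
    training and rescale survivors so the expected activation is unchanged;
    at test time, pass activations through untouched."""
    if not train:
        return x
    mask = rng.random(x.shape) >= p    # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

out = dropout(np.ones(1000), p=0.5)    # survivors are scaled to 2.0
```

Randomly dropping units prevents co-adaptation of features, which is why the network above generalized well despite its size.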
Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations
The convolutional deep belief network is presented: a hierarchical generative model that scales to realistic image sizes, is translation-invariant, and supports efficient bottom-up and top-down probabilistic inference.
PCANet: A Simple Deep Learning Baseline for Image Classification?
Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with state-of-the-art features, either prefixed, highly hand-crafted, or carefully learned (by deep neural networks, DNNs).
Maxout Networks
A simple new model called maxout is defined, designed both to facilitate optimization by dropout and to improve the accuracy of dropout's fast approximate model averaging technique.
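A maxout unit takes the element-wise maximum over k affine transformations of its input, learning its own activation function rather than using a fixed one. A minimal sketch (shapes are assumptions chosen for illustration):

```python
import numpy as np

def maxout(x, W, b):
    """Maxout unit: element-wise max over k affine pieces.

    x: input vector, shape (d_in,)
    W: weights, shape (k, d_in, d_out) -- one affine map per piece
    b: biases, shape (k, d_out)
    """
    z = np.einsum('i,kij->kj', x, W) + b   # k affine transforms of x
    return z.max(axis=0)                   # max across the k pieces

# Two pieces over a 2-d input, producing one output:
x = np.array([1.0, 2.0])
W = np.zeros((2, 2, 1))
W[0, 0, 0] = 1.0    # piece 0 selects x[0]
W[1, 1, 0] = 1.0    # piece 1 selects x[1]
b = np.zeros((2, 1))
out = maxout(x, W, b)
```

With enough pieces, the max of affine maps can approximate any convex activation, which is what makes the unit a learned activation function.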
Multi-column deep neural network for traffic sign classification
This work uses a fast, fully parameterizable GPU implementation of a Deep Neural Network (DNN) that does not require careful design of pre-wired feature extractors, which are rather learned in a supervised way.
Recurrent Convolutional Neural Networks for Text Classification
A recurrent convolutional neural network is introduced for text classification without human-designed features to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise compared to traditional window-based neural networks.
BinaryConnect: Training Deep Neural Networks with binary weights during propagations
BinaryConnect is introduced: a method for training a DNN with binary weights during the forward and backward propagations, while retaining the precision of the stored weights in which gradients are accumulated. Near state-of-the-art results are obtained with BinaryConnect on permutation-invariant MNIST, CIFAR-10, and SVHN.