Deep Neural Decision Forests

@article{Kontschieder2015DeepND,
  title={Deep Neural Decision Forests},
  author={Peter Kontschieder and Madalina Fiterau and Antonio Criminisi and Samuel Rota Bul{\`o}},
  journal={2015 IEEE International Conference on Computer Vision (ICCV)},
  year={2015},
  pages={1467--1475}
}
We present Deep Neural Decision Forests - a novel approach that unifies classification trees with the representation learning functionality known from deep convolutional networks, by training them in an end-to-end manner. Key Method: Our model differs from conventional deep networks because a decision forest provides the final predictions, and it differs from conventional decision forests since we propose a principled, joint and global optimization of split and leaf node parameters. We show experimental…
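The central mechanism of the paper - making a decision tree differentiable by routing each sample through the tree probabilistically, so split and leaf parameters can be optimized jointly by gradient descent - can be sketched as follows. This is a minimal illustration, not the authors' implementation: here the sigmoid split functions act directly on raw features and the leaf distributions are fixed inputs, whereas in the paper the split decisions are driven by a deep network's outputs and the leaf distributions are learned; the breadth-first node indexing is also an assumption for compactness.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_tree_predict(x, split_w, split_b, leaf_dist):
    """Probabilistic routing of one sample through a full binary tree.

    split_w:   (n_splits, n_features) weights of the split functions
    split_b:   (n_splits,) biases of the split functions
    leaf_dist: (n_leaves, n_classes) class distribution at each leaf
    Split nodes are numbered 1..n_splits breadth-first; the children of
    node n are nodes 2n and 2n+1, so the leaves are the last n_splits+1
    node indices.
    """
    n_splits = split_w.shape[0]
    n_leaves = leaf_dist.shape[0]
    reach = np.zeros(2 * n_splits + 2)  # P(sample reaches each node)
    reach[1] = 1.0                      # node 1 is the root
    for n in range(1, n_splits + 1):
        d = sigmoid(split_w[n - 1] @ x + split_b[n - 1])  # soft split decision
        reach[2 * n] = reach[n] * d              # route mass to left child
        reach[2 * n + 1] = reach[n] * (1.0 - d)  # route mass to right child
    mu = reach[n_splits + 1 : n_splits + 1 + n_leaves]  # leaf arrival probs
    # Prediction is a mixture of leaf class distributions, weighted by mu;
    # every term is differentiable in split_w and split_b.
    return mu @ leaf_dist

# Toy usage: depth-2 tree with 3 split nodes, 4 leaves, 2 classes, 5 features.
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 5))
b = rng.normal(size=3)
pi = rng.dirichlet(np.ones(2), size=4)  # each leaf holds a class distribution
p = soft_tree_predict(rng.normal(size=5), w, b, pi)
```

Because the leaf-arrival probabilities partition the unit mass and each leaf distribution is normalized, the output is itself a valid class distribution; a forest prediction is simply the average of several such trees.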

Citations

Deep forest
TLDR
This study opens the door to deep learning based on non-differentiable modules without gradient-based adjustment, and exhibits the possibility of constructing deep models without backpropagation.
Decision Forests, Convolutional Networks and the Models in-Between
TLDR
This paper investigates the connections between two state of the art classifiers: decision forests (DFs, including decision jungles) and convolutional neural networks (CNNs) to achieve a continuum of hybrid models with different ratios of accuracy vs. efficiency.
Making CNNs Interpretable by Building Dynamic Sequential Decision Forests with Top-down Hierarchy Learning
TLDR
Experimental results show that dDSDF not only achieves higher classification accuracy than its counterpart, i.e., the original CNN, but also has much better interpretability: qualitatively it has plausible hierarchies, and quantitatively it leads to more precise saliency maps.
End-to-End Learning of Decision Trees and Forests
TLDR
This work presents an end-to-end learning scheme for deterministic decision trees and decision forests that can learn more complex split functions than common oblique ones, and facilitates interpretability through spatial regularization.
Ensemble of binary tree structured deep convolutional network for image classification
  • Ji-Eun Lee, Min-Joo Kang, Jewon Kang
  • Computer Science
    2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)
  • 2017
TLDR
An ensemble of tree-structured learning architectures is proposed to improve the discriminative capability of a deep convolutional neural network (DCNN) for image classification, using a coarse-to-fine approach where subsequent networks in child nodes are hierarchically and randomly organized to discriminate smaller sets of classes than those in a parent node.
Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data
Nowadays, deep neural networks (DNNs) have become the main instrument for machine learning tasks within a wide range of domains, including vision, NLP, and speech. Meanwhile, in an important case of…
End-to-end Learning of Deterministic Decision Trees
TLDR
This work proposes a model and an Expectation-Maximization training scheme for decision trees that are fully probabilistic at train time but, after a deterministic annealing process, become deterministic at test time; it presents the first end-to-end learning scheme for deterministic decision trees.
Manifold Oblique Random Forests: Towards Closing the Gap on Convolutional Deep Networks
TLDR
The Manifold Oblique Random Forest (MORF) algorithm outperforms approaches that ignore feature-space structure, challenges the performance of ConvNets, runs fast, and maintains interpretability and theoretical justification.
Differentiable Decision Trees
  • Computer Science
  • 2017
TLDR
This work introduces the differentiable decision tree (DDT) as a modular component of deep networks and a simple, differentiable loss function that allows for end-to-end optimization of a deep network to compress high-dimensional data for classification by a single decision tree.
Neural Forest Learning
TLDR
This work proposes Neural Forest Learning (NFL), a novel deep-learning-based, random-forest-like method that enjoys the benefits of end-to-end, data-driven representation learning, as well as pervasive support from deep learning software and hardware platforms, hence achieving faster inference speed and higher accuracy than previous forest methods.

References

Showing 1-10 of 61 references
Neural Decision Forests for Semantic Image Labelling
TLDR
This work introduces randomized Multi-Layer Perceptrons (rMLPs) as new split nodes which are capable of learning non-linear, data-specific representations and taking advantage of them by finding optimal predictions for the emerging child nodes.
Alternating Decision Forests
TLDR
This paper introduces a novel classification method termed Alternating Decision Forests (ADFs), which formulates the training of Random Forests explicitly as a global loss minimization problem, and derives the new classifier and gives a discussion and evaluation on standard machine learning data sets.
ImageNet classification with deep convolutional neural networks
TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Network In Network
TLDR
With enhanced local modeling via the micro network, the proposed deep network structure NIN is able to utilize global average pooling over feature maps in the classification layer, which is easier to interpret and less prone to overfitting than traditional fully connected layers.
Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
TLDR
This work proposes a Parametric Rectified Linear Unit (PReLU) that generalizes the traditional rectified unit and derives a robust initialization method that particularly considers the rectifier nonlinearities.
Going deeper with convolutions
We propose a deep convolutional neural network architecture codenamed Inception that achieves the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition…
Entropy Nets: From Decision Trees to Neural Networks
TLDR
This paper shows how the mapping of decision trees into a multilayer neural network structure can be exploited for the systematic design of a class of layered neural networks, called entropy nets, that have far fewer connections.
Decision Trees Do Not Generalize to New Variations
TLDR
It is demonstrated formally that decision trees can be seriously hurt by the curse of dimensionality in a sense that is a bit different from other nonparametric statistical methods, but most importantly that they cannot generalize to variations not seen in the training set.
GeoF: Geodesic Forests for Learning Coupled Predictors
TLDR
A new and efficient forest-based model that achieves spatially consistent semantic image segmentation by encoding variable dependencies directly in the feature space the forests operate on, which outperforms both state-of-the-art forest models and the conventional pairwise CRF.
Dropout: a simple way to prevent neural networks from overfitting
TLDR
It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.