Corpus ID: 52946958

On Learning and Learned Representation with Dynamic Routing in Capsule Networks

@article{Lin2018OnLA,
  title={On Learning and Learned Representation with Dynamic Routing in Capsule Networks},
  author={Ancheng Lin and Jun Yu Li and Zhenyuan Ma},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.04041}
}
Capsule Networks (CapsNet) are recently proposed multi-stage computational models specialized for entity representation and discovery in image data. CapsNet employs iterative routing that shapes how information cascades through different levels of interpretation. In this work, we investigate i) how the routing affects CapsNet model fitting, ii) how the representation by capsules helps discover global structures in the data distribution, and iii) how the learned data representation adapts and…
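For context, the dynamic routing the abstract refers to is the routing-by-agreement procedure of Sabour et al. (2017), listed under "Dynamic Routing Between Capsules" below. The following is a minimal NumPy sketch, with illustrative tensor shapes and an assumed iteration count; the authors' own implementation is not shown on this page:

    # Routing-by-agreement, sketched after Sabour et al. (2017).
    # Shapes and num_iters are illustrative assumptions.
    import numpy as np

    def squash(s, axis=-1, eps=1e-8):
        """Shrink vectors to length < 1 while preserving orientation."""
        sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
        return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

    def dynamic_routing(u_hat, num_iters=3):
        """u_hat: prediction vectors, shape (num_in, num_out, dim_out)."""
        b = np.zeros(u_hat.shape[:2])                # routing logits
        for _ in range(num_iters):
            c = np.exp(b - b.max(axis=1, keepdims=True))
            c /= c.sum(axis=1, keepdims=True)        # coupling coefficients
            s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum over inputs
            v = squash(s)                            # output capsule poses
            b += (u_hat * v[None]).sum(axis=-1)      # agreement update
        return v

The coupling coefficients concentrate on the higher-level capsules whose outputs agree with a lower capsule's prediction, which is the mechanism whose effect on model fitting the paper analyzes.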
Training Deep Capsule Networks
TLDR: This paper shows experimentally that the routing-by-agreement algorithm does not ensure the emergence of a parse tree in the network, and introduces a new routing algorithm called dynamic deep routing, which allows the training of deeper capsule networks and is also more robust to white-box adversarial attacks than the original routing algorithm.
Capsule Networks – A survey
TLDR: A comprehensive review of state-of-the-art architectures, tools, and methodologies in existing implementations of capsule networks, highlighting the successes, failures, and opportunities for further research, intended to motivate researchers and industry players to exploit the full potential of this new field.
Capsule Neural Networks for Graph Classification using Explicit Tensorial Graph Representations
TLDR: The proposed graph capsule network classification model, which uses an explicit tensorial representation of graphs, is competitive with current state-of-the-art graph kernels and graph neural network models despite only limited hyper-parameter searching.
ARC-Net: Activity Recognition Through Capsules
TLDR: This paper introduces ARC-Net, proposing the use of capsules to fuse information from multiple inertial measurement units (IMUs) to predict the activity performed by the subject; it also investigates the directionality of the resulting confusion matrices and discusses the specificity of the activities based on the provided data.
3D Point Capsule Networks
TLDR: 3D point capsule networks are proposed: an auto-encoder designed to process sparse 3D point clouds while preserving the spatial arrangement of the input data, enabling new applications such as part interpolation and replacement.
Capsule-Based Persian/Arabic Robust Handwritten Digit Recognition Using EM Routing
TLDR: The output of the system clearly outperforms the results achieved by its ancestors, as well as other previously presented recognition algorithms.

References

Showing 1–10 of 32 references
Matrix capsules with EM routing
Dynamic Routing Between Capsules
TLDR: It is shown that a discriminatively trained, multi-layer capsule system achieves state-of-the-art performance on MNIST and is considerably better than a convolutional net at recognizing highly overlapping digits.
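In this system, class scores are the lengths of the output capsule vectors, trained with a margin loss. A hedged NumPy sketch follows, using the constants reported in that paper (m_plus = 0.9, m_minus = 0.1, lam = 0.5); the vectorized form is an illustrative reconstruction:

    # Margin loss over capsule lengths, after Sabour et al. (2017).
    import numpy as np

    def margin_loss(v, targets, m_plus=0.9, m_minus=0.1, lam=0.5):
        """v: output capsules (batch, classes, dim); targets: one-hot (batch, classes)."""
        lengths = np.linalg.norm(v, axis=-1)   # capsule length encodes class presence
        present = targets * np.maximum(0.0, m_plus - lengths) ** 2
        absent = lam * (1 - targets) * np.maximum(0.0, lengths - m_minus) ** 2
        return (present + absent).sum(axis=-1).mean()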
Flexible Manifold Learning With Optimal Graph for Image and Video Representation
TLDR: The projective unsupervised flexible embedding model with optimal graph (PUFE-OG) is proposed, which builds an optimal graph by adjusting the affinity matrix and integrates the manifold regularizer and regression residual into a unified model.
How transferable are features in deep neural networks?
TLDR: This paper quantifies the generality versus specificity of neurons in each layer of a deep convolutional neural network and reports a few surprising results, including that initializing a network with transferred features from almost any number of layers can produce a boost to generalization that lingers even after fine-tuning to the target dataset.
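The transfer protocol this summary describes (copy lower-layer features from a source network, freeze or fine-tune them, and retrain a new head) can be sketched as follows. The torchvision model and the 10-class head are assumptions for concreteness, not details from the cited paper:

    # Transfer-then-fine-tune sketch; model choice is illustrative.
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(weights="IMAGENET1K_V1")   # source-task features
    for param in model.parameters():
        param.requires_grad = False                     # freeze transferred layers
    model.fc = nn.Linear(model.fc.in_features, 10)      # new target-task head (trainable)
    # Unfreezing everything instead recovers the "transfer + fine-tune"
    # setting that the paper finds generalizes best.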
Investigating Capsule Networks with Dynamic Routing for Text Classification
TLDR: This work proposes three strategies to stabilize the dynamic routing process, alleviating the disturbance of noise capsules that may contain "background" information or have not been successfully trained.
Sparse representation classification with manifold constraints transfer
TLDR: This paper proposes a novel framework for embedding manifold priors into sparse representation-based classification (SRC) and defines an efficient alternating direction method of multipliers (ADMM) that consistently integrates the manifold constraints during optimization.
Deep Residual Learning for Image Recognition
TLDR: This work presents a residual learning framework that eases the training of networks substantially deeper than those used previously, and provides comprehensive empirical evidence that these residual networks are easier to optimize and can gain accuracy from considerably increased depth.
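The core construct is the identity-shortcut block, y = F(x) + x. A minimal PyTorch sketch with illustrative layer sizes:

    # Identity-shortcut residual block, after He et al.
    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn2 = nn.BatchNorm2d(channels)

        def forward(self, x):
            out = torch.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            return torch.relu(out + x)   # identity shortcut eases optimization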
Improving Training of Deep Neural Networks via Singular Value Bounding
TLDR: This work proposes to constrain the solutions of weight matrices to the orthogonal feasible set throughout network training, achieved by a simple yet effective method called Singular Value Bounding (SVB).
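SVB can be read as a periodic projection step: every few SGD updates, clip each weight matrix's singular values to a narrow band around 1, keeping the matrix near-orthogonal. A hedged NumPy sketch; the band width eps here is an illustrative hyper-parameter:

    # Singular Value Bounding projection, sketched.
    import numpy as np

    def bound_singular_values(W, eps=0.05):
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        s = np.clip(s, 1.0 / (1.0 + eps), 1.0 + eps)   # bound spectrum near 1
        return (U * s) @ Vt                             # reassemble the weights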
A Fast Learning Algorithm for Deep Belief Nets
TLDR: A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
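The greedy scheme trains one layer at a time as a restricted Boltzmann machine, then trains the next layer on the first layer's hidden activities, and so on. A compact NumPy sketch of one contrastive-divergence (CD-1) weight update, with biases omitted and sizes assumed for brevity:

    # One CD-1 weight update for an RBM layer; biases omitted for brevity.
    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    def cd1_step(W, v0, lr=0.1):
        """W: weights (n_visible, n_hidden); v0: data batch (batch, n_visible)."""
        h0 = sigmoid(v0 @ W)                        # up: infer hidden probabilities
        h_sample = (rng.random(h0.shape) < h0) * 1.0
        v1 = sigmoid(h_sample @ W.T)                # down: reconstruct visibles
        h1 = sigmoid(v1 @ W)                        # up again on the reconstruction
        return W + lr * (v0.T @ h0 - v1.T @ h1) / len(v0)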
ImageNet classification with deep convolutional neural networks
TLDR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into 1000 different classes, employing a recently developed regularization method called "dropout" that proved very effective.
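Dropout, the regularizer mentioned here, zeroes each activation with probability p during training. An illustrative NumPy sketch of the inverted-dropout variant (the original paper instead rescales at test time); p = 0.5 matches the rate used there:

    # Inverted dropout: rescale survivors at training time.
    import numpy as np

    def dropout(x, p=0.5, training=True, rng=np.random.default_rng()):
        if not training:
            return x                              # identity at test time
        mask = (rng.random(x.shape) >= p) / (1.0 - p)
        return x * mask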