Parallel Capsule Networks for Classification of White Blood Cells

  title={Parallel Capsule Networks for Classification of White Blood Cells},
  author={Juan P. Vigueras-Guillén and Arijit Patra and Ola Engkvist and Frank Seeliger},
Capsule Networks (CapsNets) are a machine learning architecture proposed to overcome some of the shortcomings of convolutional neural networks (CNNs). However, CapsNets have mainly outperformed CNNs on datasets where images are small and/or the objects to identify have minimal background noise. In this work, we present a new architecture, parallel CapsNets, which exploits the concept of branching the network to isolate certain capsules, allowing each branch to identify different entities. We…
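The mechanism common to the capsule papers listed below is routing-by-agreement between capsule layers. A minimal NumPy sketch of one dynamic-routing step, in the style of the original CapsNet formulation (the array shapes and iteration count here are illustrative assumptions, not taken from this paper):

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squash non-linearity: shrinks vector norms into (0, 1), preserving direction."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    """Route prediction vectors u_hat of shape (n_in, n_out, d_out)
    from n_in lower capsules to n_out output capsules."""
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))                              # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True) # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)               # weighted sum over inputs
        v = squash(s)                                        # output capsules (n_out, d_out)
        b = b + (u_hat * v[None]).sum(axis=-1)               # reward agreeing predictions
    return v

rng = np.random.default_rng(0)
v = dynamic_routing(rng.normal(size=(32, 10, 16)))
print(v.shape)                                   # (10, 16)
print(np.linalg.norm(v, axis=-1).max() < 1.0)    # squash keeps norms below 1 -> True
```

The output capsule norms act as entity-presence probabilities, which is why the squash function bounds them below 1.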

Capsule Networks against Medical Imaging Data Challenges
The results suggest that capsule networks can be trained with less data for the same or better performance and are more robust to an imbalanced class distribution, which makes the approach very promising for the medical imaging community.
Brain Tumor Type Classification via Capsule Networks
The proposed approach successfully outperforms CNNs on the brain tumor classification problem, and the over-fitting behavior of CapsNets is investigated on a real set of MRI images.
Capsules for Object Segmentation
The proposed convolutional-deconvolutional capsule network, called SegCaps, shows strong results for the task of object segmentation with a substantial decrease in parameter space, and is able to handle large image sizes, unlike baseline capsule networks.
Automated Classification of Apoptosis in Phase Contrast Microscopy Using Capsule Network
This paper proposes an efficient variant of capsule networks (CapsNets) as an alternative to CNNs and proposes a recurrent CapsNet constructed by stacking a CapsNet and a bi-directional long short-term recurrent structure to utilize temporal information within microscopy videos.
Diagnosing Colorectal Polyps in the Wild with Capsule Networks
This work designs a novel capsule network architecture (D-Caps) to improve the viability of optical biopsy of colorectal polyps and demonstrates improved results over the previous state-of-the-art convolutional neural network (CNN) approach by as much as 43%.
WBCaps: A Capsule Architecture-based Classification Model Designed for White Blood Cells Identification
  • Yan Liu, Ying Fu, Pu Chen
  • Medicine, Computer Science
  • 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
  • 2019
A capsule architecture-based classification model named WBCaps automatically recognizes the five types of white blood cells (WBC) in normal peripheral blood smears, and is intended as a promising clinical tool for hematology analyzers to facilitate cytological and morphological examination.
Self-Attention Capsule Networks for Object Classification
The proposed Self-Attention CapsNet significantly improved classification performance within and across different datasets and outperformed the baseline CapsNet, ResNet-18 and DenseNet-40 not only in classification accuracy but also in robustness.
Dynamic Routing on Deep Neural Network for Thoracic Disease Classification and Sensitive Area Localization
A new deep neural network architecture for automatic thoracic disease detection on chest X-rays is presented, and the benefit of model interpretability is shown by generating Gradient-weighted Class Activation Mappings (Grad-CAM) for localization.
Breast Cancer Classification using Capsule Network with Preprocessed Histology Images
  • Anupama M A, S. V, S. K P
  • 2019 International Conference on Communication and Signal Processing (ICCSP)
  • 2019
Breast cancer is one of the most dangerous forms of cancer among women, and it is diagnosed using histology images. The purpose of this paper is to classify different types of…
Densely Connected Convolutional Networks
The Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion, has several compelling advantages: it alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.
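The dense connectivity described above, where each layer receives the concatenation of all preceding feature maps, can be sketched as follows. The toy random linear layers here are an assumption standing in for the paper's BN-ReLU-Conv composite function:

```python
import numpy as np

def dense_block(x, n_layers=4, growth=12, seed=0):
    """Each layer consumes the concatenation of all earlier outputs and
    contributes `growth` new feature channels (toy ReLU linear layers)."""
    rng = np.random.default_rng(seed)
    feats = [x]
    for _ in range(n_layers):
        inp = np.concatenate(feats, axis=-1)   # all earlier feature maps
        w = rng.normal(size=(inp.shape[-1], growth)) / np.sqrt(inp.shape[-1])
        feats.append(np.maximum(inp @ w, 0.0)) # new features appended, never replaced
    return np.concatenate(feats, axis=-1)

x = np.ones((8, 24))      # batch of 8 samples, 24 input channels
out = dense_block(x)
print(out.shape)          # (8, 72): 24 input + 4 layers * 12 new channels
```

Because features are concatenated rather than summed, the channel count grows linearly with depth, which is why DenseNet layers can be narrow (small `growth`) while keeping gradients well-propagated.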