Corpus ID: 220250370

Switchblade - a Neural Network for Hard 2D Tasks

Emils Ozolins, Karlis Freivalds, Agris Sostaks
Convolutional neural networks have become the main tools for processing two-dimensional data. They work well for images, yet convolutions have a limited receptive field that prevents their application to more complex 2D tasks. We propose a new neural network model, named Switchblade, that can efficiently exploit long-range dependencies in 2D data and solve much more challenging tasks. It has close-to-optimal $\mathcal{O}(n^2 \log{n})$ complexity for processing an $n \times n$ data matrix. Besides…
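The stated $\mathcal{O}(n^2 \log{n})$ bound can be sanity-checked with a toy cost model. The sketch below is a hypothetical illustration, not the paper's actual architecture: it assumes roughly $\log_2 n$ processing stages, each visiting every cell of the $n \times n$ matrix once.

```python
import math

def switchblade_cost(n):
    """Hypothetical cost model (illustration only): ~log2(n) stages,
    each touching all n*n cells of the input matrix once."""
    stages = max(1, math.ceil(math.log2(n)))
    return stages * n * n

# Doubling n quadruples the per-stage work and adds one more stage,
# so total cost grows slightly faster than quadratically.
cost = switchblade_cost(1024)   # 10 stages * 1024 * 1024 cells
```

For comparison, a model with a full pairwise interaction over all cells would cost $\mathcal{O}(n^4)$, which is why the near-quadratic bound matters for large inputs.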


Residual Conv-Deconv Grid Network for Semantic Segmentation
The GridNet is a new Convolutional Neural Network architecture for semantic image segmentation that generalizes many well known networks such as conv-deconv, residual or U-Net networks and achieves competitive results on the Cityscapes dataset.
A Comprehensive Survey on Graph Neural Networks
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Fast-SCNN: Fast Semantic Segmentation Network
This paper introduces the fast segmentation convolutional neural network (Fast-SCNN), an above-real-time semantic segmentation model for high-resolution image data, suited to efficient computation on embedded devices with low memory.
Very Deep Convolutional Networks for Large-Scale Image Recognition
This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
Attention is All you Need
A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as shown by applying it successfully to English constituency parsing with both large and limited training data.
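The attention mechanism the Transformer is built on reduces to a single formula, $\mathrm{softmax}(QK^\top/\sqrt{d})\,V$. A minimal NumPy sketch (the shapes here are chosen purely for illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 query positions, width 8
K = rng.standard_normal((6, 8))   # 6 key/value positions
V = rng.standard_normal((6, 8))
out = scaled_dot_product_attention(Q, K, V)         # shape (4, 8)
```

Because every query attends to every key, the receptive field spans the whole input in one layer, at the price of quadratic cost in sequence length.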
U-Net: Convolutional Networks for Biomedical Image Segmentation
It is shown that such a network can be trained end-to-end from very few images and outperforms the prior best method (a sliding-window convolutional network) on the ISBI challenge for segmentation of neuronal structures in electron microscopic stacks.
Neural Shuffle-Exchange Networks - Sequence Processing in O(n log n) Time
A new Shuffle-Exchange neural network model for sequence-to-sequence tasks is introduced that has O(log n) depth and O(n log n) total complexity, and it is shown to be powerful enough to infer efficient algorithms for common algorithmic benchmarks, including sorting, addition, and multiplication.
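The O(n log n) bound comes from the perfect-shuffle wiring pattern between layers. A minimal sketch of that permutation alone (the learned exchange units of the actual model are omitted):

```python
def perfect_shuffle(seq):
    """One perfect-shuffle step: interleave the two halves of the
    sequence, like riffling a deck of cards.  For a power-of-two
    length this is a cyclic rotation of the index bits."""
    half = len(seq) // 2
    out = []
    for a, b in zip(seq[:half], seq[half:]):
        out.extend([a, b])
    return out

# Applying the shuffle log2(8) = 3 times returns elements to their
# starting positions; between shuffles the network applies a learned
# "exchange" unit to each adjacent pair, so any two positions can
# interact within O(log n) layers.
seq = list(range(8))
shuffled = perfect_shuffle(seq)   # [0, 4, 1, 5, 2, 6, 3, 7]
```

This routing pattern is the same one used in classic shuffle-exchange switching networks, which is where the model's name and its complexity bound come from.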
Deep Residual Learning for Image Recognition
This work presents a residual learning framework to ease the training of networks that are substantially deeper than those used previously, and provides comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth.
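The core residual idea fits in a few lines: a layer fits a residual function F(x) and outputs x + F(x), so the identity mapping is trivially representable and gradients flow through the skip connection. A minimal sketch (the `transform` argument stands in for any learned layer stack):

```python
import numpy as np

def residual_block(x, transform):
    """Residual learning: output x + F(x) instead of F(x) alone,
    so the identity mapping is easy to represent."""
    return x + transform(x)

x = np.ones(4)
# With the zero transform the block is exactly the identity mapping,
# which is what makes very deep stacks trainable.
y = residual_block(x, lambda v: np.zeros_like(v))
```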
Combinatorial Optimization with Graph Convolutional Networks and Guided Tree Search
Experimental results demonstrate that the presented approach substantially outperforms recent deep learning work, and performs on par with highly optimized state-of-the-art heuristic solvers for some NP-hard problems.
Multi-Scale Context Aggregation by Dilated Convolutions
This work develops a new convolutional network module that is specifically designed for dense prediction, and shows that the presented context module increases the accuracy of state-of-the-art semantic segmentation systems.
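The context-aggregation trick rests on a simple arithmetic fact: stacking kernel-size-3 convolutions with exponentially increasing dilations grows the receptive field exponentially with depth rather than linearly. A small sketch of that calculation (1-D case for simplicity):

```python
def receptive_field(dilations, kernel_size=3):
    """Receptive field of stacked 1-D dilated convolutions:
    each layer with dilation d and kernel k adds (k - 1) * d."""
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d
    return rf

# Exponentially increasing dilations: 5 layers already cover 63
# positions, versus 11 for the same 5 layers without dilation.
exponential = receptive_field([1, 2, 4, 8, 16])   # -> 63
plain = receptive_field([1, 1, 1, 1, 1])          # -> 11
```

This is why dilated stacks aggregate multi-scale context without pooling or striding, preserving the full resolution needed for dense prediction.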