Two Headed Dragons: Multimodal Fusion and Cross Modal Transactions

@article{Bose2021TwoHD,
  title={Two Headed Dragons: Multimodal Fusion and Cross Modal Transactions},
  author={Rupak Bose and Shivam Pande and Biplab Banerjee},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.11585}
}
As the field of remote sensing evolves, we witness the accumulation of information from several modalities, such as multispectral (MS), hyperspectral (HSI), and LiDAR. Each of these modalities possesses its own distinct characteristics and, when combined synergistically, performs very well on recognition and classification tasks. However, fusing multiple modalities in remote sensing is cumbersome due to their highly disparate domains. Furthermore, the existing methods do not facilitate cross…
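The title and the Transformer reference below suggest the cross-modal "transactions" are attention-based. A minimal NumPy sketch of cross-modal attention, where tokens from one modality (e.g. HSI) query keys and values from the other (e.g. LiDAR), might look like the following; the function names, shapes, and the absence of learned projections are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(q, k, v):
    """Scaled dot-product attention where queries come from one
    modality and keys/values from the other. q: [n_q, d], k/v: [n_k, d]."""
    scores = q @ k.T / np.sqrt(q.shape[-1])   # [n_q, n_k] affinities
    return softmax(scores, axis=-1) @ v       # [n_q, d] fused features

# toy example: 4 HSI tokens attend over 6 LiDAR tokens, feature dim 8
rng = np.random.default_rng(0)
hsi = rng.standard_normal((4, 8))
lidar = rng.standard_normal((6, 8))
fused = cross_modal_attention(hsi, lidar, lidar)
print(fused.shape)  # (4, 8)
```

In a real fusion network, q, k, and v would come from learned linear projections of each modality's encoder features, and the attention would typically run in both directions (HSI attending to LiDAR and vice versa) before classification.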


References

Showing 1-10 of 11 references.
FusAtNet: Dual Attention based SpectroSpatial Multimodal Fusion Network for Hyperspectral and LiDAR Classification
The proposed FusAtNet framework achieves state-of-the-art classification performance, including on the largest HSI-LiDAR dataset available, University of Houston (Data Fusion Contest 2013), opening new avenues in multimodal feature fusion for classification.
Deep Encoder-Decoder Networks for Classification of Hyperspectral and LiDAR Data
This work presents a simple but effective multimodal DL baseline by following a deep encoder-decoder network architecture, EndNet for short, for the classification of hyperspectral and light detection and ranging (LiDAR) data.
Multisource Remote Sensing Data Classification Based on Convolutional Neural Network
The classification fusion of hyperspectral imagery (HSI) and data from other multiple sensors, such as light detection and ranging (LiDAR) data, is investigated with a state-of-the-art deep learning method, the two-branch convolutional neural network (CNN).
Hyperspectral and LiDAR Fusion Using Deep Three-Stream Convolutional Neural Networks
A novel framework is proposed for the fusion of hyperspectral images and LiDAR-derived elevation data based on CNNs and composite kernels, achieving higher spectral, spatial, and elevation separability of the extracted features and effectively performing multi-sensor data fusion in kernel space.
Multisource Hyperspectral and LiDAR Data Fusion for Urban Land-Use Mapping based on a Modified Two-Branch Convolutional Neural Network
A modified two-branch convolutional neural network for the adaptive fusion of hyperspectral imagery (HSI) and Light Detection and Ranging (LiDAR) data is proposed, with both branches sharing the same network structure to reduce the time cost of network design.
Deep Fusion of Remote Sensing Data for Accurate Classification
A new feature fusion framework based on deep neural networks (DNNs) is proposed to effectively extract features of multi-/hyperspectral and light detection and ranging data; it provides competitive results in terms of classification accuracy.
Attention is All you Need
A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, applied successfully to English constituency parsing with both large and limited training data.
Adam: A Method for Stochastic Optimization
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
Understanding the difficulty of training deep feedforward neural networks
The objective here is to understand better why standard gradient descent from random initialization does so poorly with deep neural networks, to better understand these recent relative successes, and to help design better algorithms in the future.
Layer Normalization
Training state-of-the-art deep neural networks is computationally expensive. One way to reduce the training time is to normalize the activities of the neurons. A recently introduced technique called…