Corpus ID: 209531644

Stacked DeBERT: All Attention in Incomplete Data for Text Classification

@article{Sergio2020StackedDA,
  title={Stacked DeBERT: All Attention in Incomplete Data for Text Classification},
  author={Gwenaelle Cunha Sergio and Minho Lee},
  journal={ArXiv},
  year={2020},
  volume={abs/2001.00137}
}
  • Gwenaelle Cunha Sergio, Minho Lee
  • Published in ArXiv 2020
  • Computer Science
  • In this paper, we propose Stacked DeBERT, short for Stacked Denoising Bidirectional Encoder Representations from Transformers. This novel model improves robustness to incomplete data, compared to existing systems, by designing a novel encoding scheme in BERT, a powerful language representation model based solely on attention mechanisms. Incomplete data in natural language processing refers to text with missing or incorrect words, and its presence can hinder the performance of current models…
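The abstract describes a denoising step built into BERT's encoding of corrupted text, but this page does not show the authors' implementation. The following is a minimal, hypothetical PyTorch sketch of the general idea only: a small denoising autoencoder that reconstructs clean token embeddings from corrupted ones before a BERT-style classifier would consume them. The names (DenoisingEmbeddingLayer, corrupt), layer sizes, and the zero-masking corruption scheme are illustrative assumptions, not the paper's actual method.

import torch
import torch.nn as nn

class DenoisingEmbeddingLayer(nn.Module):
    """Reconstruct clean token embeddings from corrupted ones
    (hypothetical stand-in for the paper's denoising encoding scheme)."""
    def __init__(self, hidden_size=768, bottleneck=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(hidden_size, bottleneck), nn.ReLU())
        self.decoder = nn.Linear(bottleneck, hidden_size)

    def forward(self, corrupted):
        # corrupted: (batch, seq_len, hidden_size)
        return self.decoder(self.encoder(corrupted))

def corrupt(embeddings, drop_prob=0.15):
    # Simulate incomplete data by zeroing random token embeddings,
    # a crude proxy for missing or incorrect words in the input text.
    keep = (torch.rand(embeddings.shape[:2]) > drop_prob).float().unsqueeze(-1)
    return embeddings * keep

torch.manual_seed(0)
denoiser = DenoisingEmbeddingLayer()
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

clean = torch.randn(8, 32, 768)  # stand-in for BERT token embeddings
for step in range(3):
    reconstructed = denoiser(corrupt(clean))
    loss = loss_fn(reconstructed, clean)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: reconstruction loss {loss.item():.4f}")

In a full pipeline, the reconstructed embeddings would then pass through a standard transformer encoder and classification head; that part is omitted here.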
