Low-Resource Sequence Labeling via Unsupervised Multilingual Contextualized Representations

@inproceedings{Bao2019LowResourceSL,
  title={Low-Resource Sequence Labeling via Unsupervised Multilingual Contextualized Representations},
  author={Zuyi Bao and Rui Huang and C. Li and Kenny Zhu},
  booktitle={EMNLP/IJCNLP},
  year={2019}
}
  • Zuyi Bao, Rui Huang, C. Li, Kenny Zhu
  • Published in EMNLP/IJCNLP 2019
  • Computer Science
  • Previous work on cross-lingual sequence labeling tasks either requires parallel data or bridges the two languages through word-by-word matching. Such requirements and assumptions are infeasible for most language pairs, especially those with large linguistic distances, e.g., English and Chinese. In this work, we propose a Multilingual Language Model with deep semantic Alignment (MLMA) to generate language-independent representations for cross-lingual sequence labeling. Our methods require only… (see the illustrative sketch below for the general cross-lingual tagging setup)
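
The abstract describes tagging text in a target language using language-independent contextual representations from a shared multilingual model, without parallel data. As a rough illustration only, and not the authors' MLMA architecture, the sketch below wires a generic pretrained multilingual encoder to a linear per-token tagging head; the head would be trained on source-language annotations and applied unchanged to target-language sentences. The encoder name ("bert-base-multilingual-cased"), the label set, the class names, and the transformers/torch dependencies are all illustrative assumptions, not details taken from the paper.

# Minimal sketch (not the paper's MLMA model): cross-lingual sequence labeling
# with a shared multilingual contextual encoder and a linear tagging head.
# Assumes the `transformers` and `torch` packages; the model name and label set
# below are illustrative placeholders.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

ENCODER_NAME = "bert-base-multilingual-cased"  # placeholder multilingual encoder
LABELS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]  # example tag set


class CrossLingualTagger(nn.Module):
    """Shared contextual representations -> per-token label scores."""

    def __init__(self, encoder_name: str, num_labels: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        # Contextualized subword representations from the multilingual encoder.
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        return self.classifier(hidden)  # (batch, seq_len, num_labels)


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(ENCODER_NAME)
    model = CrossLingualTagger(ENCODER_NAME, num_labels=len(LABELS)).eval()

    # The head would be trained on source-language (e.g. English) annotations and
    # then applied to target-language input; here we only show the forward pass.
    batch = tokenizer(["Barack Obama visited Beijing."], return_tensors="pt")
    with torch.no_grad():
        scores = model(batch["input_ids"], batch["attention_mask"])
    predicted = [LABELS[i] for i in scores.argmax(dim=-1)[0].tolist()]
    tokens = tokenizer.convert_ids_to_tokens(batch["input_ids"][0].tolist())
    for token, label in zip(tokens, predicted):
        print(f"{token}\t{label}")

The paper's contribution concerns how the multilingual representations are semantically aligned across languages; the sketch only shows where such representations would plug into a sequence-labeling head.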
