What Matters for Neural Cross-Lingual Named Entity Recognition: An Empirical Analysis

@inproceedings{Huang2019WhatMF,
  title={What Matters for Neural Cross-Lingual Named Entity Recognition: An Empirical Analysis},
  author={Xiaolei Huang and Jonathan May and Nanyun Peng},
  booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
  year={2019}
}
Building named entity recognition (NER) models for languages that do not have much training data is a challenging task. While recent work has shown promising results on cross-lingual transfer from high-resource languages to low-resource languages, it is unclear what knowledge is transferred. In this paper, we first propose a simple and efficient neural architecture for cross-lingual NER. Experiments show that our model achieves competitive performance with the state-of-the-art. We further…


