75 Languages, 1 Model: Parsing Universal Dependencies Universally

@article{Kondratyuk201975L1,
  title={75 Languages, 1 Model: Parsing Universal Dependencies Universally},
  author={Daniel Kondratyuk},
  journal={ArXiv},
  year={2019},
  volume={abs/1904.02099}
}
We present UDify, a multilingual multi-task model capable of accurately predicting universal part-of-speech, morphological features, lemmas, and dependency trees simultaneously for all 124 Universal Dependencies treebanks across 75 languages. By leveraging a multilingual BERT self-attention model pretrained on 104 languages, we found that fine-tuning it on all datasets concatenated together with simple softmax classifiers for each UD task can result in state-of-the-art UPOS, UFeats, Lemmas, UAS…
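The abstract describes the core architecture: a single shared multilingual encoder with one independent softmax classifier per UD task. A minimal sketch of that shared-encoder / per-task-head pattern is below; all shapes, label counts, and the random stand-in encoder are illustrative assumptions, not UDify's actual implementation (which fine-tunes multilingual BERT).

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 8          # contextual vector size per token (768 in multilingual BERT)
TASKS = {           # hypothetical label-set sizes, not the real UD inventories
    "upos": 17,     # universal part-of-speech tags
    "ufeats": 40,   # morphological feature bundles
    "deprel": 37,   # dependency relation labels
}

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# One simple linear + softmax head per task, all reading the SAME representation.
heads = {t: (rng.normal(size=(HIDDEN, n)), np.zeros(n)) for t, n in TASKS.items()}

def encode(tokens):
    """Stand-in for the shared multilingual encoder: one vector per token."""
    return rng.normal(size=(len(tokens), HIDDEN))

def predict(tokens):
    h = encode(tokens)                     # shared contextual representation
    out = {}
    for task, (W, b) in heads.items():
        probs = softmax(h @ W + b)         # task-specific softmax classifier
        out[task] = probs.argmax(axis=-1)  # one predicted label per token
    return out

preds = predict(["The", "cat", "sleeps"])
```

The design point the paper makes is that the heads themselves stay deliberately simple; the heavy lifting is done by fine-tuning the shared encoder on all treebanks concatenated together.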
