Corpus ID: 245424725

Investigating Effect of Dialogue History in Multilingual Task Oriented Dialogue Systems

@article{Sun2021InvestigatingEO,
  title={Investigating Effect of Dialogue History in Multilingual Task Oriented Dialogue Systems},
  author={Michael Mu Sun and Kaili Huang and M. Moradshahi},
  journal={ArXiv},
  year={2021},
  volume={abs/2112.12318}
}
While English virtual assistants have achieved impressive performance thanks to enormous amounts of training resources, the needs of non-English speakers remain poorly served. As of December 2021, Alexa, one of the most popular smart speakers in the world, supports only 9 languages [1], while there are thousands of languages in the world, 91 of which are spoken by more than 10 million people according to statistics published in 2019 [2]. However, training a virtual assistant…


References

BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling
TLDR
Introduces BiToD, the first bilingual multi-domain dataset for end-to-end task-oriented dialogue modeling, and provides state-of-the-art baselines under three evaluation settings (monolingual, bilingual, and cross-lingual).
End-to-End Task-Completion Neural Dialogue Systems
TLDR
The end-to-end system not only outperforms modularized dialogue system baselines in both objective and subjective evaluations, but is also robust to noise, as demonstrated by systematic experiments with different error granularities and rates specific to the language understanding module.
Overview of the Ninth Dialog System Technology Challenge: DSTC9
TLDR
Describes the task definition, provided datasets, baselines, and evaluation set-up for each track, and summarizes the results of the submitted systems to highlight overall trends in state-of-the-art technologies for these tasks.
Multilingual Denoising Pre-training for Neural Machine Translation
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART…