Corpus ID: 231846449

Effects of Layer Freezing when Transferring DeepSpeech to New Languages

@article{Eberhard2021EffectsOL,
  title={Effects of Layer Freezing when Transferring DeepSpeech to New Languages},
  author={Onno Eberhard and Torsten Zesch},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.04097}
}
In this paper, we train Mozilla’s DeepSpeech architecture on German and Swiss German speech datasets and compare the results of different training methods. We first train the models from scratch on both languages and then improve upon the results by using an English pretrained version of DeepSpeech for weight initialization, experimenting with the effects of freezing different layers during training. We find that freezing even a single layer already improves the results dramatically.
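The layer-freezing idea the abstract describes can be sketched in code. The following is a minimal TensorFlow/Keras illustration, not the authors' actual training setup (which builds on Mozilla's DeepSpeech code base): the model below only roughly mimics DeepSpeech's layout (three dense layers, an LSTM, a further dense layer, and an output layer), and the layer sizes, checkpoint path, and number of frozen layers are assumptions chosen for illustration.

```python
import tensorflow as tf

N_FEATURES = 26   # assumed acoustic feature dimension (e.g. MFCCs)
N_HIDDEN = 2048   # assumed hidden-layer width
N_LABELS = 29     # assumed output alphabet size

# Toy stand-in for DeepSpeech's topology: 3 dense layers, 1 LSTM,
# 1 dense layer, then the output layer.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, N_FEATURES)),
    tf.keras.layers.Dense(N_HIDDEN, activation="relu", name="dense_1"),
    tf.keras.layers.Dense(N_HIDDEN, activation="relu", name="dense_2"),
    tf.keras.layers.Dense(N_HIDDEN, activation="relu", name="dense_3"),
    tf.keras.layers.LSTM(N_HIDDEN, return_sequences=True, name="lstm"),
    tf.keras.layers.Dense(N_HIDDEN, activation="relu", name="dense_5"),
    tf.keras.layers.Dense(N_LABELS, name="output"),
])

# Initialize from English pretrained weights (hypothetical checkpoint path).
model.load_weights("english_pretrained_weights.h5")

# Freeze the first N_FROZEN layers so that fine-tuning on the new
# language only updates the remaining layers.
N_FROZEN = 1  # per the abstract, even one frozen layer already helps
for layer in model.layers[:N_FROZEN]:
    layer.trainable = False

# Fine-tuning on German / Swiss German data (with CTC loss, as in
# DeepSpeech) would follow here.
```

Freezing only the first layer corresponds to the lightest transfer setting studied in the paper; increasing `N_FROZEN` keeps more of the English-pretrained feature extraction fixed during fine-tuning.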

