Rapid adaptation for deep neural networks through multi-task learning

@inproceedings{Huang2015RapidAF,
  title={Rapid adaptation for deep neural networks through multi-task learning},
  author={Zhen Huang and Jinyu Li and Sabato Marco Siniscalchi and I-Fan Chen and Ji Wu and Chin-Hui Lee},
  booktitle={INTERSPEECH},
  year={2015}
}
We propose a novel approach to addressing the adaptation effectiveness issue in parameter adaptation for deep neural network (DNN) based acoustic models for automatic speech recognition by adding one or more small auxiliary output layers modeling broad acoustic units, such as mono-phones or tied-state (often called senone) clusters. In scenarios with a limited amount of available adaptation data, most senones are usually rarely seen or not observed, and consequently the ability to model them in…
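
The approach described in the abstract pairs the standard senone output layer with a small auxiliary head over broad acoustic units (e.g., monophones) on top of shared hidden layers, and adapts the shared parameters with a joint multi-task loss. The following is a minimal PyTorch sketch of that structure; the class name MTLAdaptNet, the layer sizes, and the auxiliary weight lam are illustrative assumptions, not the paper's actual configuration.

import torch
import torch.nn as nn

class MTLAdaptNet(nn.Module):
    """Shared hidden stack with two output heads: senones (primary ASR
    targets) and monophones (auxiliary broad units). Sizes are illustrative."""
    def __init__(self, feat_dim=440, hidden=1024, n_senones=3000, n_monophones=40):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.Sigmoid(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
        )
        self.senone_head = nn.Linear(hidden, n_senones)   # primary task
        self.mono_head = nn.Linear(hidden, n_monophones)  # auxiliary broad units

    def forward(self, x):
        h = self.shared(x)
        return self.senone_head(h), self.mono_head(h)

def adapt_step(model, optimizer, feats, senone_tgt, mono_tgt, lam=0.5):
    """One adaptation update with a multi-task loss: primary senone
    cross-entropy plus a weighted auxiliary monophone cross-entropy.
    lam is a hypothetical interpolation weight, not a value from the paper."""
    ce = nn.CrossEntropyLoss()
    senone_logits, mono_logits = model(feats)
    loss = ce(senone_logits, senone_tgt) + lam * ce(mono_logits, mono_tgt)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch with random data (a batch of 8 frames):
model = MTLAdaptNet()
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
feats = torch.randn(8, 440)
senone_tgt = torch.randint(0, 3000, (8,))
mono_tgt = torch.randint(0, 40, (8,))
adapt_step(model, opt, feats, senone_tgt, mono_tgt)

The intuition from the abstract: when adaptation data are too sparse to cover most senones, the auxiliary monophone task still receives training signal for every frame and thus regularizes the shared layers during adaptation.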


Key Quantitative Results

  • Experimental results on the 20,000-word open-vocabulary WSJ task demonstrate that the proposed framework consistently outperforms conventional linear hidden layer adaptation schemes without MTL, providing a 3.2% relative word error rate reduction (WERR) with only a single adaptation utterance, and a 10.7% WERR with 40 adaptation utterances, against the un-adapted DNN models (see the WERR sketch below).
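
Relative WERR here is the reduction in word error rate expressed as a fraction of the baseline WER: WERR = (WER_baseline - WER_adapted) / WER_baseline. A small sketch with hypothetical WER values, which are not figures from the paper:

def relative_werr(wer_baseline: float, wer_adapted: float) -> float:
    """Relative word error rate reduction, as a fraction of the baseline WER."""
    return (wer_baseline - wer_adapted) / wer_baseline

# Hypothetical numbers for illustration only: a baseline WER of 10.0%
# dropping to 8.93% after adaptation is a 10.7% relative WERR.
print(f"{relative_werr(10.0, 8.93):.1%}")  # 10.7%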

Citations

Semantic Scholar estimates that this publication has 55 citations (35 extracted) based on the available data.

[Citations per Year chart: 2015–2018]
