Corpus ID: 56895355

Meta Learning for Few-shot Keyword Spotting

@article{Chen2018MetaLF,
  title={Meta Learning for Few-shot Keyword Spotting},
  author={Yangbin Chen and Tom Ko and Lifeng Shang and Xiao Chen and Xin Jiang and Qing Li},
  journal={ArXiv},
  year={2018},
  volume={abs/1812.10233}
}
Keyword spotting with limited training data is a challenging task which can be treated as a few-shot learning problem. In this paper, we present a meta-learning approach which learns a good initialization of the base KWS model from an existing labeled dataset. It can then quickly adapt to new keyword spotting tasks with only a few labeled examples. Furthermore, to strengthen the ability to distinguish the keywords from all others, we incorporate the negative class as external knowledge into the…
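The abstract outlines a MAML-style recipe: meta-learn an initialization of the KWS model from existing labeled keywords, then adapt it to a new keyword set (plus a shared negative "other" class) using only a few labeled utterances. The sketch below illustrates that idea only; the toy model, the random episode sampler, the first-order update, and all hyperparameters are assumptions for demonstration (in PyTorch), not the authors' implementation.

# Illustrative first-order MAML-style sketch for few-shot keyword spotting.
# The tiny model, the random "episode" sampler, and all hyperparameters are
# placeholder assumptions, not code from the paper.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

N_CLASSES = 5   # e.g. 4 target keywords + 1 shared negative ("other") class
FEAT_DIM = 40   # e.g. 40-dim log-Mel filterbank features, averaged over time

def make_model():
    # Stand-in for a real KWS acoustic model (a CNN/ResNet in practice).
    return nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, N_CLASSES))

def sample_task(shots=5, queries=15):
    # Stand-in for sampling an episode: a few labeled utterances per class
    # (support) plus held-out utterances (query). Random data keeps it runnable.
    sx = torch.randn(N_CLASSES * shots, FEAT_DIM)
    sy = torch.arange(N_CLASSES).repeat_interleave(shots)
    qx = torch.randn(N_CLASSES * queries, FEAT_DIM)
    qy = torch.arange(N_CLASSES).repeat_interleave(queries)
    return sx, sy, qx, qy

def inner_adapt(model, sx, sy, inner_lr=0.01, steps=5):
    # Clone the meta-initialization and fine-tune it on the few support examples.
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(adapted(sx), sy).backward()
        opt.step()
    return adapted

def meta_train(model, meta_lr=1e-3, meta_steps=200):
    # Move the shared initialization toward weights that adapt well on each task.
    meta_opt = torch.optim.SGD(model.parameters(), lr=meta_lr)
    for _ in range(meta_steps):
        sx, sy, qx, qy = sample_task()
        adapted = inner_adapt(model, sx, sy)
        adapted.zero_grad()
        F.cross_entropy(adapted(qx), qy).backward()   # query loss after adaptation
        # First-order approximation: apply the adapted copy's gradients
        # directly to the meta-parameters.
        for p, q in zip(model.parameters(), adapted.parameters()):
            p.grad = q.grad.clone()
        meta_opt.step()
    return model

if __name__ == "__main__":
    meta_init = meta_train(make_model())
    # At deployment, a new keyword set would be learned with inner_adapt() alone,
    # from just a few labeled utterances of the new keywords.

The first-order update is used here purely to keep the sketch short; the full second-order MAML gradient would backpropagate through the inner adaptation loop.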

Citations

Few-Shot Keyword Spotting With Prototypical Networks
TLDR
This paper proposes a solution to the few-shot keyword spotting problem using temporal and dilated convolutions on prototypical networks, and demonstrates spotting of new keywords using just a small number of samples (the prototypical-network idea is sketched after this list).
Metric Learning for Keyword Spotting
TLDR
A new method based on metric learning is proposed that not only maximises the distance between target and non-target keywords, but also learns per-class weights for target keywords, as in classification objectives.
Towards Data-Efficient Modeling for Wake Word Spotting
TLDR
The proposed system is composed of a multi-condition training pipeline with stratified data augmentation, which improves the model's robustness to a variety of predefined acoustic conditions, together with a semi-supervised learning pipeline that extracts the wake word (WW) and adversarial examples from an untranscribed speech corpus.
Query-by-Example On-Device Keyword Spotting
TLDR
A threshold prediction method that uses only the user-specific keyword hypothesis is proposed; it generates query-specific negatives by rearranging each query utterance in the waveform domain and decides the threshold based on the enrollment queries and the generated negatives.
Eliminating Data Collection Bottleneck for Wake Word Engine Training Using Found and Synthetic Data
TLDR
Novel techniques for curating wake word engine (WWE) datasets are presented that significantly minimize the need for data collection from humans, making a cycle-time savings of more than an order of magnitude possible.
On Front-end Gain Invariant Modeling for Wake Word Spotting
TLDR
A novel approach is proposed that uses a new feature, ΔLFBE, to decouple audio front-end (AFE) gain variations from the WW model, modifying the neural network architectures to accommodate the delta computation while leaving the feature extraction module unchanged.
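Several of the citing papers above build on prototypical networks rather than a learned initialization. As a quick point of comparison with the meta-initialization approach of this paper, here is a minimal sketch of prototypical-network classification; the embedding net, feature dimensions, and episode shapes are placeholder assumptions, not code from any of the cited works.

# Minimal prototypical-network classification sketch (illustrative only).
# The embedding net and episode shapes below are placeholder assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def prototypes(embed, support_x, support_y, n_classes):
    # One prototype per class: the mean embedding of its support examples.
    z = embed(support_x)                                   # (n_support, emb_dim)
    return torch.stack([z[support_y == c].mean(0) for c in range(n_classes)])

def classify(embed, protos, query_x):
    # Assign each query to the nearest prototype (squared Euclidean distance).
    zq = embed(query_x)                                    # (n_query, emb_dim)
    dists = torch.cdist(zq, protos) ** 2                   # (n_query, n_classes)
    return F.log_softmax(-dists, dim=1)                    # log-probabilities

if __name__ == "__main__":
    # Toy usage with random features standing in for keyword audio embeddings.
    n_classes, shots, feat_dim, emb_dim = 5, 5, 40, 16
    embed = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))
    sx = torch.randn(n_classes * shots, feat_dim)
    sy = torch.arange(n_classes).repeat_interleave(shots)
    protos = prototypes(embed, sx, sy, n_classes)
    qx = torch.randn(8, feat_dim)
    print(classify(embed, protos, qx).argmax(dim=1))       # predicted classes

Unlike the initialization-based approach, prototypical networks need no gradient steps at test time: enrolling new keywords only requires computing their prototypes from the few support examples.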

References

Showing 1-10 of 35 references
Deep Residual Learning for Small-Footprint Keyword Spotting
  • Raphael Tang, Jimmy J. Lin
  • 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018
TLDR
This work explores the application of deep residual learning and dilated convolutions to the keyword spotting task, using the recently released Google Speech Commands Dataset as a benchmark, and establishes an open-source state-of-the-art reference to support the development of future speech-based interfaces.
Optimization as a Model for Few-Shot Learning
Small-footprint keyword spotting using deep neural networks
TLDR
This application requires a keyword spotting system with a small memory footprint, low computational cost, and high precision; a simple approach based on deep neural networks is proposed that achieves a 45% relative improvement over a competitive Hidden Markov Model-based system.
Attention-based End-to-End Models for Small-Footprint Keyword Spotting
TLDR
An attention-based end-to-end neural approach for small-footprint keyword spotting (KWS) is proposed that aims to simplify the pipeline for building a production-quality KWS system; it outperforms the recent Deep KWS approach by a large margin, with the best performance achieved by a CRNN.
Query-by-example keyword spotting using long short-term memory networks
TLDR
A novel approach to query-by-example keyword spotting (KWS) using a long short-term memory (LSTM) recurrent neural network-based feature extractor that has a small memory footprint, low computational cost, and high precision, making it suitable for on-device applications.
Convolutional neural networks for small-footprint keyword spotting
TLDR
This work explores using Convolutional Neural Networks for a small-footprint keyword spotting task and finds that the CNN architectures offer a 27-44% relative improvement in false reject rate compared to a DNN, while fitting within the constraints of each application.
Meta-SGD: Learning to Learn Quickly for Few Shot Learning
TLDR
Meta-SGD, an SGD-like, easily trainable meta-learner that can initialize and adapt any differentiable learner in just one step, shows highly competitive performance for few-shot learning on regression, classification, and reinforcement learning.
Model Compression Applied to Small-Footprint Keyword Spotting
TLDR
Two ways to improve deep neural network acoustic models for keyword spotting without increasing CPU usage are investigated: using low-rank weight matrices throughout the DNN, and distilling knowledge from an ensemble of much larger DNNs used only during training.
Convolutional Recurrent Neural Networks for Small-Footprint Keyword Spotting
TLDR
Systems and methods for creating and using convolutional recurrent neural networks for small-footprint keyword spotting (KWS) are described; a CRNN model embodiment demonstrated high accuracy and robust performance in a wide range of environments.
Meta Networks
TLDR
A novel meta-learning method, Meta Networks (MetaNet), is introduced that learns meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization.