Missing-feature approaches in speech recognition

Abstract

In this article we review a wide variety of techniques based on the identification of missing spectral features that have proved effective in reducing the error rates of automatic speech recognition systems. These approaches have been especially effective in ameliorating the effects of transient maskers such as impulsive noise or background music. We describe two broad classes of missing-feature algorithms: feature-vector imputation algorithms, which restore unreliable components of incoming feature vectors, and classifier-modification algorithms, which dynamically reconfigure the classifier itself to cope with the effects of unreliable feature components. We review the mathematics of four major missing-feature techniques: the feature-imputation techniques of cluster-based reconstruction and covariance-based reconstruction, and the classifier-modification methods of class-conditional imputation and marginalization. We also discuss how the common feature-extraction procedures of cepstral analysis, temporal-difference features, and mean subtraction can be accommodated by speech recognition systems that make use of missing-feature techniques. We conclude with a discussion of selected experimental results. These results confirm the effectiveness of all of the missing-feature approaches discussed in ameliorating the effects of both stationary and transient noise, as well as the particular effectiveness of soft masks and fragment decoding.
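The marginalization method mentioned in the abstract can be illustrated with a minimal sketch. For a classifier with diagonal-covariance Gaussian class models, integrating an unreliable spectral component over its full range contributes a constant factor, so marginalization reduces to evaluating the likelihood over the reliable components only. The function names and the toy class structure below are illustrative assumptions for this sketch, not the paper's implementation:

```python
import math

def gaussian_loglik(x, mean, var, reliable):
    """Log-likelihood of feature vector x under a diagonal-covariance
    Gaussian, marginalizing out components flagged as unreliable.
    With a diagonal covariance, each unreliable dimension integrates
    out, so only reliable dimensions contribute to the sum."""
    ll = 0.0
    for xi, mu, v, ok in zip(x, mean, var, reliable):
        if ok:  # reliable component: score it normally
            ll += -0.5 * (math.log(2 * math.pi * v) + (xi - mu) ** 2 / v)
        # unreliable component: integrated out, contributes nothing
    return ll

def classify(x, reliable, classes):
    """Pick the class whose marginal likelihood over the reliable
    components of x is highest."""
    return max(classes,
               key=lambda c: gaussian_loglik(x, c["mean"], c["var"], reliable))
```

In this scheme a corrupted spectral value cannot drag the decision toward the wrong class, because the mask simply excludes it from every class's score; the feature-imputation methods described in the article would instead replace the corrupted value before classification.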

Semantic Scholar estimates that this publication has 227 citations based on the available data.


Cite this paper

@article{Raj2005MissingfeatureAI,
  title={Missing-feature approaches in speech recognition},
  author={Bhiksha Raj and R. Stern},
  journal={IEEE Signal Processing Magazine},
  year={2005},
  volume={22},
  pages={101-116}
}