Noise-robust speech recognition using a cepstral noise reduction neural network architecture

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


This paper addresses the problem of speech recognition in the presence of interfering non-stationary noise. A method for noise reduction in the cepstral domain based on a universal approximator is proposed and tested on a large database of isolated words contaminated with non-stationary F-16 jet cockpit noise. The speech recognition system is a concatenation of one auditory and two neural models: an auditory preprocessing module, the Cepstral Noise Reduction (CNR) network, and a neural network classifier. The proposed CNR network architecture performs a nonlinear autoassociative mapping in the cepstral domain between a set of noisy cepstral coefficients from the preprocessing module and a set of noise-free cepstral coefficients. The output of the CNR network is fed to the neural network classifier, whose output functions approximate the Bayes optimal discriminant functions. Noise reduction is furthermore possible in the preprocessing module and in the classifier, so the system can be considered a three-stage noise reduction system. The average recognition rate on a test database improved by up to 65% when the CNR network was added to the speech recognition system.
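The abstract does not give the CNR network's exact topology or training procedure; the sketch below merely illustrates the general idea of a nonlinear mapping from noisy to clean cepstral coefficients, using a one-hidden-layer network trained with plain gradient descent on synthetic stand-in data (all dimensions and data are illustrative assumptions, not values from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

n_cep = 12       # number of cepstral coefficients (illustrative choice)
n_hidden = 24    # hidden-layer width (illustrative choice)
n_samples = 500

# Synthetic stand-in for cepstral data: "clean" vectors plus additive noise.
clean = rng.normal(size=(n_samples, n_cep))
noisy = clean + 0.3 * rng.normal(size=(n_samples, n_cep))

# One-hidden-layer network mapping noisy cepstra -> clean cepstra.
W1 = rng.normal(scale=0.1, size=(n_cep, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_cep))
b2 = np.zeros(n_cep)

lr = 0.05
losses = []
for epoch in range(200):
    # Forward pass.
    h = np.tanh(noisy @ W1 + b1)
    out = h @ W2 + b2
    err = out - clean                       # gradient of 0.5 * squared error
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared error.
    gW2 = h.T @ err / n_samples
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)      # tanh derivative
    gW1 = noisy.T @ dh / n_samples
    gb1 = dh.mean(axis=0)
    # Gradient-descent updates.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE before training: {losses[0]:.3f}, after: {losses[-1]:.3f}")
```

In the paper's pipeline the denoised output of such a mapping would then be passed on to the classifier stage rather than used directly.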
Original language: English
Title of host publication: IJCNN-91
Publisher: IEEE Press
Publication date: 1991
ISBN (Print): 0-7803-0164-1
Publication status: Published - 1991
Externally published: Yes
Event: 1991 International Joint Conference on Neural Networks - Seattle, WA, United States
Duration: 8 Jul 1991 - 12 Jul 1991


Conference: 1991 International Joint Conference on Neural Networks
Country/Territory: United States
City: Seattle, WA


