Causal binary mask estimation for speech enhancement using sparsity constraints

Abigail Anne Kressner, David V. Anderson, Christopher J. Rozell

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

While most single-channel noise reduction algorithms fail to improve speech intelligibility, the ideal binary mask (IBM) has demonstrated substantial intelligibility improvements for both normal- and impaired-hearing listeners. However, this approach exploits oracle knowledge of the target and interferer signals to preserve only the time-frequency regions that are target-dominated. Single-channel noise suppression algorithms that attempt to approximate the IBM using locally estimated signal-to-noise ratios, without oracle knowledge, have had limited success. Viewed another way, the IBM exploits the disjoint placement of the target and interferer in time and frequency to create a time-frequency signal representation that is sparser (i.e., has fewer non-zeros). In recent work (submitted to ICASSP 2013) we introduced a novel time-frequency masking algorithm based on a sparse approximation algorithm from the signal processing literature. However, that algorithm employs a non-causal estimator. The present work introduces an improved de-noising algorithm that uses more realistic frame-based (causal) computations to estimate a binary mask.
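For orientation, the sketch below illustrates the oracle IBM construction the abstract refers to: a time-frequency unit is retained only when the target's local energy exceeds the interferer's by a local criterion. The function name, the local criterion parameter lc_db, and the STFT settings are illustrative assumptions for this sketch, not the authors' implementation (which estimates the mask causally and without oracle access to the target and interferer).

```python
import numpy as np
from scipy.signal import stft, istft

def ideal_binary_mask(target, interferer, fs, lc_db=0.0, nperseg=512):
    """Oracle IBM sketch: keep time-frequency units whose local
    target-to-interferer ratio exceeds the local criterion lc_db (in dB)."""
    _, _, T = stft(target, fs=fs, nperseg=nperseg)      # target spectrogram
    _, _, I = stft(interferer, fs=fs, nperseg=nperseg)  # interferer spectrogram
    eps = np.finfo(float).eps
    local_snr_db = 10.0 * np.log10((np.abs(T) ** 2 + eps) / (np.abs(I) ** 2 + eps))
    return (local_snr_db > lc_db).astype(float)         # 1 = target-dominated unit

# Toy usage: mask a noisy mixture and resynthesize (synthetic signals, not speech)
fs = 16000
rng = np.random.default_rng(0)
t = np.arange(fs) / fs
target = np.sin(2 * np.pi * 440.0 * t)        # stand-in for the target signal
interferer = 0.5 * rng.standard_normal(fs)    # stand-in for the interferer

mask = ideal_binary_mask(target, interferer, fs)
_, _, M = stft(target + interferer, fs=fs, nperseg=512)  # mixture spectrogram
_, enhanced = istft(mask * M, fs=fs, nperseg=512)         # masked resynthesis
```

Because the mask is computed from the separate target and interferer spectrograms, this is strictly an oracle baseline; the paper's contribution is estimating such a mask from the noisy mixture alone using causal, frame-based computations and sparsity constraints.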
Original language: English
Title of host publication: Proceedings of Meetings on Acoustics
Number of pages: 9
Publication date: 2013
Article number: 055037
DOIs
Publication status: Published - 2013
Event: 21st International Congress on Acoustics - Montreal, Canada
Duration: 2 Jun 2013 - 7 Jun 2013
Conference number: 21
http://www.ica2013montreal.org/

Conference

Conference: 21st International Congress on Acoustics
Number: 21
Country: Canada
City: Montreal
Period: 02/06/2013 - 07/06/2013
Internet address: http://www.ica2013montreal.org/
