Dynamic nsNET2: Efficient Deep Noise Suppression with Early Exiting

Riccardo Miccini, Alaa Zniber, Clément Laroche, Tobias Piechowiak, Martin Schoeberl, Luca Pezzarossa, Ouassim Karrakchou, Jens Sparsø, Mounir Ghogho

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Although deep learning has made strides in the field of deep noise suppression, leveraging deep architectures on resource-constrained devices remains challenging. Therefore, we present an early-exiting model based on nsNet2 that provides several levels of accuracy and resource savings by halting computations at different stages. Moreover, we adapt the original architecture by splitting the information flow to account for the injected dynamism. We show the trade-offs between performance and computational complexity based on established metrics.
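The core idea of early exiting, as described in the abstract, can be illustrated with a minimal sketch: the model is split into sequential stages, each followed by a lightweight exit branch, and inference halts at the first exit whose confidence passes a threshold. This is a hypothetical illustration of the general technique, not the paper's actual nsNet2-based architecture; the function and stage/exit names are invented for the example.

```python
# Hypothetical sketch of early exiting (not the paper's code):
# `stages` carry the main computation, `exits` are cheap prediction
# heads; we stop as soon as an exit is confident enough, saving the
# cost of the remaining stages.

def run_with_early_exit(x, stages, exits, threshold):
    """Run stages in order; after each, query the matching exit branch
    and halt once its confidence reaches `threshold` (or at the last
    stage). Returns (output, number_of_stages_executed)."""
    h = x
    for i, (stage, exit_branch) in enumerate(zip(stages, exits), start=1):
        h = stage(h)                      # main computation for this stage
        out, confidence = exit_branch(h)  # cheap early-exit head
        if confidence >= threshold or i == len(stages):
            return out, i                 # halt: confident enough, or done

# Toy example: each stage doubles the input; exits report fixed confidences.
stages = [lambda h: h * 2, lambda h: h * 2, lambda h: h * 2]
exits = [
    lambda h: (h, 0.4),  # low confidence -> keep computing
    lambda h: (h, 0.9),  # high confidence -> exit here
    lambda h: (h, 1.0),  # final exit always fires
]
out, stages_used = run_with_early_exit(1, stages, exits, threshold=0.8)
# out == 4, stages_used == 2: the third stage was never executed
```

Lowering the threshold trades accuracy for compute, which is the performance/complexity trade-off the paper quantifies with established metrics.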
Original language: English
Title of host publication: Proceedings of the 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing (MLSP)
Number of pages: 6
Publisher: IEEE
Publication date: 2023
ISBN (Print): 979-8-3503-2412-9
ISBN (Electronic): 979-8-3503-2411-2
DOIs
Publication status: Published - 2023
Event: 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing - Rome, Italy
Duration: 17 Sept 2023 – 20 Sept 2023

Conference

Conference: 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing
Location: Rome, Italy
Country/Territory: Italy
City: Rome
Period: 17/09/2023 – 20/09/2023

Keywords

  • Deep Noise Suppression
  • Dynamic Neural Networks
  • Early-exiting
