Abstract
Although deep learning has made strides in deep noise suppression, deploying deep architectures on resource-constrained devices remains challenging. We therefore present an early-exiting model based on nsNet2 that provides several levels of accuracy and resource savings by halting computation at different stages. Moreover, we adapt the original architecture by splitting the information flow to account for the injected dynamism. We show the trade-offs between performance and computational complexity based on established metrics.
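The early-exiting idea described in the abstract can be sketched as follows. This is a minimal illustration only: the stage functions, exit heads, and the confidence threshold are hypothetical placeholders, not the paper's actual nsNet2 architecture or exit criterion.

```python
# Sketch of early exiting: run stages in sequence and halt as soon as an
# exit head is confident enough (all components here are illustrative).

def early_exit_pipeline(x, stages, exits, threshold):
    """Run `stages` sequentially; after each stage, the matching exit head
    returns (estimate, confidence). Stop once confidence clears `threshold`."""
    for depth, (stage, exit_head) in enumerate(zip(stages, exits), start=1):
        x = stage(x)
        estimate, confidence = exit_head(x)
        if confidence >= threshold:
            return estimate, depth  # halt early: cheaper inference
    return estimate, depth          # fallback: full-depth result

# Toy stages: each halves the signal. Toy exits: confidence grows with depth.
stages = [lambda x: [v * 0.5 for v in x]] * 3
exits = [lambda x, c=c: (x, c) for c in (0.3, 0.6, 0.9)]

est, depth = early_exit_pipeline([1.0, 2.0], stages, exits, threshold=0.5)
# Exits after stage 2 (confidence 0.6 >= 0.5), skipping the third stage.
```

Each exit depth trades accuracy for compute, which is the several-levels-of-savings behavior the abstract describes.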
Original language | English |
---|---|
Title of host publication | Proceedings of the 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing (MLSP) |
Number of pages | 6 |
Publisher | IEEE |
Publication date | 2023 |
ISBN (Print) | 979-8-3503-2412-9 |
ISBN (Electronic) | 979-8-3503-2411-2 |
Publication status | Published - 2023 |
Event | 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing - Rome, Italy Duration: 17 Sept 2023 → 20 Sept 2023 |
Conference
Conference | 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing |
---|---|
Location | Rome, Italy |
Country/Territory | Italy |
City | Rome |
Period | 17/09/2023 → 20/09/2023 |
Keywords
- Deep Noise Suppression
- Dynamic Neural Networks
- Early-exiting