A sensitivity analysis on the effect of hyperparameters in deep neural operators applied to sound propagation

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Deep neural operators have received much attention in the scientific machine learning community over the last couple of years due to their ability to efficiently learn nonlinear operators mapping between input and output function spaces while showing good generalization properties. This work will show how to set up a performant DeepONet architecture in acoustics for predicting 2-D sound fields with parameterized moving sources for real-time applications. A sensitivity analysis is carried out with a focus on the choice of network architecture, activation function, Fourier feature expansion, and data fidelity to gain insight into how to tune these models. Specifically, a default feed-forward neural network (FNN), a modified FNN, and a convolutional neural network will be compared. This work will demystify the DeepONet and provide helpful knowledge from an acoustical point of view.
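
For orientation, the sketch below illustrates the general DeepONet structure referred to in the abstract: a branch network encodes the parameterized source, a trunk network encodes the space-time query coordinates (optionally after a Fourier feature expansion), and the sound field is predicted as the inner product of the two embeddings. This is a minimal PyTorch sketch of the default FNN variant only; the layer sizes, the use of source coordinates as branch input, and the Gaussian Fourier-feature mapping are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal DeepONet sketch (illustrative; sizes and inputs are assumptions).
import torch
import torch.nn as nn


class FourierFeatures(nn.Module):
    """Map trunk coordinates x to [sin(2*pi*x B), cos(2*pi*x B)]."""

    def __init__(self, in_dim: int, num_features: int, scale: float = 1.0):
        super().__init__()
        # Fixed (untrained) random projection matrix B.
        self.register_buffer("B", scale * torch.randn(in_dim, num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = 2.0 * torch.pi * x @ self.B
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)


def mlp(sizes, activation=nn.Tanh):
    """Plain feed-forward network (the 'default FNN' option)."""
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:
            layers.append(activation())
    return nn.Sequential(*layers)


class DeepONet(nn.Module):
    """Branch encodes the parameterized source, trunk encodes (x, y, t)
    query points; the prediction is their inner product."""

    def __init__(self, branch_in: int, trunk_in: int, p: int = 100):
        super().__init__()
        self.branch = mlp([branch_in, 128, 128, p])
        self.ff = FourierFeatures(trunk_in, 64)       # output dim: 128
        self.trunk = mlp([128, 128, 128, p])
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        b = self.branch(u)             # (batch, p)
        t = self.trunk(self.ff(y))     # (n_points, p)
        return b @ t.T + self.bias     # (batch, n_points) field values


# Example: 2 source configurations evaluated at 500 space-time points.
model = DeepONet(branch_in=2, trunk_in=3)
pressure = model(torch.rand(2, 2), torch.rand(500, 3))
```

The modified FNN and CNN variants compared in the paper replace the branch and trunk networks above with their respective architectures; the inner-product combination of branch and trunk outputs stays the same.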
Original language: English
Title of host publication: Proceedings of the 10th Convention of the European Acoustics Association
Number of pages: 8
Publication date: 2023
Publication status: Published - 2023
Event: 10th Convention of the European Acoustics Association - Politecnico di Torino, Torino, Italy
Duration: 11 Sept 2023 - 15 Sept 2023
https://www.fa2023.org/

Conference

Conference: 10th Convention of the European Acoustics Association
Location: Politecnico di Torino
Country/Territory: Italy
City: Torino
Period: 11/09/2023 - 15/09/2023
Internet address: https://www.fa2023.org/

Keywords

  • Neural operators
  • Sensitivity analysis
  • Virtual acoustics
  • DeepONet
