Deep learning based segmentation of fish in noisy forward looking MBES images

Jesper Haahr Christensen*, Lars Valdemar Mogensen, Ole Ravn

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Research › peer-review



In this work, we investigate a Deep Learning (DL) approach to fish segmentation in a small dataset of noisy, low-resolution images generated by a forward-looking multibeam echosounder (MBES). We build on recent advances in DL and Convolutional Neural Networks (CNNs) for semantic segmentation and demonstrate an end-to-end approach that predicts a fish/non-fish probability for every range-azimuth position projected by an imaging sonar. We use self-collected datasets from the Danish Sound and the Faroe Islands to train and test our model, and present techniques to obtain satisfactory performance and generalization even with a low-volume dataset. We show that our model achieves the desired performance and has learned to exploit semantic context, separating noise and non-targets from real targets. Furthermore, we present techniques for deploying models on low-cost embedded platforms to obtain higher performance in edge environments - where compute and power are restricted by size and cost - for testing and prototyping.
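The abstract describes predicting a fish/non-fish probability for each range-azimuth position of the sonar image. As a minimal sketch (not the authors' model; the array shapes, threshold, and function name are illustrative assumptions), the final step of such a pipeline maps a network's per-pixel logits to probabilities with a sigmoid and thresholds them into a binary fish mask:

```python
import numpy as np

def logits_to_fish_mask(logits, threshold=0.5):
    """Map a (range, azimuth) logit array to per-pixel fish
    probabilities via the sigmoid, then threshold to a binary mask."""
    probs = 1.0 / (1.0 + np.exp(-logits))   # per-pixel fish probability
    return probs, probs >= threshold

# Toy 4x4 logit map: positive logits indicate likely fish returns.
logits = np.array([[-3.0, -2.0, 0.5, 2.0],
                   [-2.5, -1.0, 1.5, 3.0],
                   [-3.0, -2.0, 0.0, 1.0],
                   [-4.0, -3.0, -1.0, -0.5]])
probs, mask = logits_to_fish_mask(logits)
```

In a real deployment the logit map would come from the CNN's forward pass over the sonar image; only this deterministic post-processing step is shown here.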

Original language: English
Book series: IFAC-PapersOnLine
Issue number: 2
Pages (from-to): 14546-14551
Publication status: Published - 2020
Event: 21st IFAC World Congress 2020 - Berlin, Germany
Duration: 12 Jul 2020 - 17 Jul 2020


Conference: 21st IFAC World Congress 2020

Bibliographical note

Publisher Copyright:
Copyright © 2020 The Authors. This is an open access article under the CC BY-NC-ND license.


  • Autonomous underwater vehicle (AUV)
  • Deep learning
  • Fish monitoring
  • Multibeam echosounder (MBES) imaging
  • Semantic segmentation
  • Sonar imaging


