Lightweight SAR Ship Detection

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Non-cooperative vessels pose a challenge to traditional maritime surveillance systems. To overcome this challenge, alternative surveillance methods such as space-based monitoring sensors have been employed. However, the time-consuming process of satellite downlink hampers near-real-time applications. To address these issues, the use of onboard Artificial Intelligence for direct data processing has emerged as a key technology. This study explores the implementation of a lightweight Synthetic Aperture Radar ship detection model inspired by YOLOv8. The model achieves promising results on an annotated dataset, demonstrating the effectiveness of the approach for detecting both small and large ships. The study investigates the impact of atrous and depth-wise convolutions on the model's performance and explores model quantization for further size reduction. Our final model has 0.3 million parameters and reached an average precision of 95.4%. The results highlight the potential of lightweight models for onboard ship detection, offering comparable accuracy to larger models.
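The abstract credits depth-wise convolutions with helping shrink the detector to 0.3 million parameters. A minimal sketch of why this works, comparing the parameter count of a standard convolution with a depth-wise separable one (the function names and the 64 → 128 channel example are illustrative, not taken from the paper; biases are omitted):

```python
def standard_conv_params(c_in, c_out, k):
    # A standard conv learns one k x k filter per (input, output) channel pair.
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    # Depth-wise step: one k x k filter per input channel, applied channel-by-channel.
    depthwise = c_in * k * k
    # Point-wise step: a 1 x 1 conv that mixes channels to produce c_out outputs.
    pointwise = c_in * c_out
    return depthwise + pointwise

# Example: a 3x3 layer mapping 64 channels to 128 channels.
std = standard_conv_params(64, 128, 3)        # 73,728 parameters
sep = depthwise_separable_params(64, 128, 3)  # 8,768 parameters
print(std, sep, round(std / sep, 1))          # ~8.4x fewer parameters
```

The same factorization also cuts multiply-accumulate operations by roughly the same ratio, which is what makes it attractive for onboard inference.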
Original language: English
Title of host publication: IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium
Publication date: 2023
ISBN (Electronic): 979-8-3503-2010-7
Publication status: Published - 2023
Event: 2023 IEEE International Geoscience and Remote Sensing Symposium - Pasadena Convention Center, Pasadena, United States
Duration: 16 Jul 2023 – 21 Jul 2023
Conference number: 43


Conference: 2023 IEEE International Geoscience and Remote Sensing Symposium
Location: Pasadena Convention Center
Country/Territory: United States
Series: IEEE International Geoscience and Remote Sensing Symposium Proceedings


Keywords:
  • Onboard Artificial Intelligence
  • Ship detection
  • Synthetic Aperture Radar
  • Maritime surveillance


