Vision-based Object Tracking in Marine Environments using Features from Neural Network Detections

Frederik Emil Thorsson Schöller, Mogens Blanke, Martin Krarup Plenge-Feidenhans'l, Lazaros Nalpantidis

    Research output: Contribution to journal › Conference article › Research › peer-review


    Abstract

    Autonomous decision support is desired to enable navigation with a temporarily unattended bridge, or to have the vessel navigated remotely. Safe navigation requires correctly interpreting the current situation in any scenario, and proper perception of the surrounding environment is essential for good situational awareness. This paper suggests a method for tracking objects that have been detected by a neural network. The method utilises features already computed during the detection step, thereby ensuring features that are representative of the given objects while saving the time it would take to compute new ones. The suggested method is evaluated on data acquired in Danish near-coastal waters. Evaluation shows that the tracking method tracks the detections well, with few switches of object identity. The method is shown to outperform a similar tracking algorithm while retaining the speed needed for real-time applications.
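    The core idea described in the abstract — associating new detections to existing tracks by reusing feature vectors the detector has already computed, rather than extracting fresh appearance features — can be sketched as follows. This is an illustrative sketch, not the paper's actual algorithm: the feature dimensionality, the cosine-similarity metric, and the matching threshold are all assumptions made here for the example.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(track_feats, det_feats, threshold=0.5):
        """Match detections to tracks using detector feature vectors.

        track_feats: (T, D) array, one stored feature vector per active track
        det_feats:   (N, D) array, feature vectors taken from the detector's
                     output for the current frame (no extra feature extraction)
        threshold:   illustrative minimum cosine similarity for a valid match

        Returns a list of (track_idx, det_idx) pairs.
        """
        # Normalise rows so the dot product equals cosine similarity.
        t = track_feats / np.linalg.norm(track_feats, axis=1, keepdims=True)
        d = det_feats / np.linalg.norm(det_feats, axis=1, keepdims=True)
        sim = t @ d.T  # (T, N) pairwise similarity matrix

        # Hungarian assignment maximises total similarity (minimise the negative),
        # which discourages identity switches between visually similar objects.
        rows, cols = linear_sum_assignment(-sim)
        return [(int(r), int(c)) for r, c in zip(rows, cols)
                if sim[r, c] >= threshold]
    ```

    Because the features come for free from the detection step, the per-frame cost of association is just one similarity matrix and one assignment solve, which is what keeps such a scheme fast enough for real-time use.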
    Original language: English
    Book series: IFAC-PapersOnLine
    Volume: 53
    Issue number: 2
    Pages (from-to): 14517-14523
    ISSN: 2405-8963
    Publication status: Published - 2021
    Event: IFAC World Congress 2020 - Virtual, Berlin, Germany
    Duration: 13 Jul 2020 - 17 Jul 2020
    https://www.ifac2020.org/

    Conference

    Conference: IFAC World Congress 2020
    Location: Virtual
    Country/Territory: Germany
    City: Berlin
    Period: 13/07/2020 - 17/07/2020

    Keywords

    • Autonomous Marine Vessels
    • Navigation
    • Computer Vision
    • Object Tracking

