Electronic Outlook - A Comparison of Human Outlook with a Computer-vision Solution

Mogens Blanke, Søren Hansen, Jonathan Dyssel Stets, Thomas Koester, Jesper E. Brøsted, Adrian Llopart Maurin, Nicolai Nykvist

Research output: Book/Report › Report › Research › peer-review


Abstract

Considering whether a temporarily unattended bridge could be allowed, Maritime Authorities wish to investigate whether sensor technology is available that, when supported by sophisticated computer algorithms, can provide outlook with the same reliability and safety as the average human outlook. This document reports findings from a comparative study of human versus electronic outlook. Assessment of the navigator's outlook is based on measurements with a wearable eye-tracker, and the areas of visual attention are recorded on video. Simultaneously, a set of electro-optical sensors provides image data as input to computer algorithms that detect and classify objects within visual range. Ambient light conditions on the bridge prevented eye-tracking from disclosing which objects on the Radar and ECDIS screens caught the navigator's attention. The scope of this investigation was therefore limited to the navigator's visual attention on objects. The report compares these eye-tracking measurements with object recognition based on camera recordings in the visual spectrum and subsequent computerized object classification. From the observations of eye fixations, the report deduces when the navigator became aware of a particular object, and it analyses how the human observations compare with those of the technology solution. On the technology side, the report presents approaches to detection and classification that proved efficient in coastal areas with confined passages. The quality of outlook in different ambient light conditions is illustrated. The main findings are:
• Eye-tracking glasses were found useful to show fixations on objects at sea in daylight conditions.
• The computer-vision algorithms detect objects in parallel, whereas the human does so sequentially; on average, the computer classifies an object 24 s before the navigator first fixates on it (a minimal comparison sketch follows this list). The deep-learning algorithm trained in this study should, however, be improved to achieve better performance in some situations.
• The time between object detection and passage of own ship is adequate for making navigation decisions with both human and electronic outlook.
• Low-light conditions (dusk and night) are effectively dealt with by Long-Wave InfraRed (LWIR) camera technology; LWIR shows objects equally well by day and by night.
• Colour information from cameras is necessary to assist decision support and electronic navigation.
• A system for electronic outlook should employ sensor and data fusion with radar, AIS and ECDIS.
• Decision support based on electronic outlook should include object tracking and situation awareness techniques.
• Quality assurance and approval of machine-learning algorithms for object classification at sea have unsolved issues. A standard vocabulary for objects at sea ought to be available, and publicly available databases with annotated images of traffic in open seas, coastal areas and rivers are needed in order for authorities to assess the quality of, or approve, navigation support based on machine-learning methods.
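
As an illustration of the camera-based detection and classification step described above, the sketch below runs a generic pretrained deep-learning detector (torchvision's Faster R-CNN) on a single camera frame and prints the confident detections. This is not the detector trained in the study; the model choice, frame path and score threshold are assumptions for illustration only.

```python
# Minimal sketch of camera-based object detection and classification.
# Uses a generic pretrained detector from torchvision; this is NOT the
# deep-learning model trained in the report, and the frame path and
# score threshold are illustrative assumptions.
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = read_image("bridge_camera_frame.jpg").float() / 255.0  # CHW, values in [0, 1]

with torch.no_grad():
    detections = model([frame])[0]  # dict with 'boxes', 'labels', 'scores'

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score >= 0.5:  # keep confident detections only
        print(f"class {int(label)}: score {score:.2f}, box {box.tolist()}")
```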
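
The reported 24-second average lead of the computer over the navigator rests on comparing, per object, the time of first computer classification with the time of the navigator's first eye-tracker fixation on that object. A minimal sketch of that comparison is given below; the object identifiers and timestamps are hypothetical example data, not measurements from the study.

```python
# Sketch of the lead-time comparison between electronic and human outlook:
# for each object, the time of first computer-vision classification is
# compared with the time of the navigator's first eye-tracker fixation.
# Object IDs and timestamps below are hypothetical example data.
from statistics import mean

# seconds since start of the recording
first_cv_classification = {"obj_01": 102.0, "obj_02": 245.5, "obj_03": 580.0}
first_eye_fixation      = {"obj_01": 131.0, "obj_02": 262.5, "obj_03": 601.0}

leads = []
for obj, t_cv in first_cv_classification.items():
    if obj in first_eye_fixation:  # only objects noticed by both outlooks
        leads.append(first_eye_fixation[obj] - t_cv)

print(f"computer leads the navigator by {mean(leads):.1f} s on average "
      f"over {len(leads)} objects")
```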
Original language: English
Publisher: Technical University of Denmark
Number of pages: 43
ISBN (Print): 978-87-91184-07-9
DOIs
Publication status: Published - 2019

Bibliographical note

© 2019 DTU and FORCE Technology. This report can be freely distributed with proper reference to the source. No changes are permitted in the document.
