Event-Based Classification of Defects in Civil Infrastructures with Artificial and Spiking Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Small Multirotor Autonomous Vehicles (MAVs) can be used to inspect civil infrastructure at height, improving safety and reducing costs. However, challenges remain, such as accurate visual inspection under high-contrast lighting and the power efficiency needed for longer deployment times. Event cameras and Spiking Neural Networks (SNNs) can help address these challenges: event cameras are more robust to varying lighting conditions, and SNNs promise to be more power efficient on neuromorphic hardware. This work presents an initial investigation of the benefits of combining event cameras and SNNs for the onboard, real-time classification of civil structural defects. Results showed that event cameras allow higher defect classification accuracy than image-based methods under dynamic lighting conditions. Moreover, SNNs deployed on neuromorphic boards are 65-135 times more energy efficient than Artificial Neural Networks (ANNs) deployed on conventional hardware accelerators. This approach shows promise for reliable, long-lasting drone-based visual inspections.
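The record does not include code, but as a rough illustration of the kind of pipeline the abstract describes (event-camera frames fed through a spiking classifier), the sketch below implements a minimal leaky integrate-and-fire (LIF) readout layer in plain NumPy. All names, shapes, and parameters (frame size, time constant tau, threshold v_th, the untrained random weights, the four defect classes) are placeholder assumptions for illustration only, not the authors' architecture.

```python
# Minimal, illustrative sketch of a spiking (LIF) classifier over event frames.
# NOT the architecture from the paper; shapes, time constants, and thresholds
# are placeholder assumptions chosen only to show the mechanism.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 64x64 event-camera frames binned into T time steps,
# classified into 4 defect categories (e.g. crack / spalling / corrosion / none).
T, H, W, num_classes = 20, 64, 64, 4
tau, v_th = 10.0, 1.0                     # membrane time constant and firing threshold
weights = rng.normal(0.0, 0.05, size=(H * W, num_classes))  # untrained, for illustration

def lif_classify(event_frames):
    """Leaky-integrate incoming events per class; the most-spiking neuron wins."""
    v = np.zeros(num_classes)             # membrane potentials
    spike_counts = np.zeros(num_classes)
    for frame in event_frames:            # frame: (H, W) binned event counts
        i_in = frame.reshape(-1) @ weights    # input current per output neuron
        v += (i_in - v) / tau                 # leaky integration step
        spikes = v >= v_th                    # fire where the threshold is crossed
        spike_counts += spikes
        v[spikes] = 0.0                       # reset neurons that fired
    return int(np.argmax(spike_counts))       # class with the most output spikes

# Fake sparse event stream, just to exercise the function end to end.
dummy_events = (rng.random((T, H, W)) < 0.02).astype(np.float32)
print("predicted class:", lif_classify(dummy_events))
```

Because activity in such a network is sparse and event-driven, spikes (and hence computation) occur only where events arrive, which is the property that neuromorphic hardware exploits to obtain the energy savings reported in the abstract.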
Original language: English
Title of host publication: Advances in Computational Intelligence
Publisher: Springer
Publication date: 2023
Pages: 629-640
ISBN (Print): 978-3-031-43077-0
Publication status: Published - 2023
Event: 17th International Work-Conference on Artificial Neural Networks - Ponta Delgada, Portugal
Duration: 19 Jun 2023 - 21 Jun 2023

Conference

Conference: 17th International Work-Conference on Artificial Neural Networks
Country/Territory: Portugal
City: Ponta Delgada
Period: 19/06/2023 - 21/06/2023
Series: Lecture Notes in Computer Science
Volume: 14135
ISSN: 0302-9743

