How Well Can Driverless Vehicles Hear? A Gentle Introduction to Auditory Perception for Autonomous and Smart Vehicles

Letizia Marchegiani, Xenofon Fafoutis

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

From sirens to lane markings, the urban environment is full of sounds designed to direct the driver's attention towards events that require special care. Microphone-equipped autonomous vehicles can also use these acoustic cues to increase safety and performance. This article explores auditory perception in the context of autonomous driving and smart vehicles in general, examining the potential of exploiting acoustic cues in driverless vehicle technology. Through a journey through the literature, we discuss various applications of auditory perception in driverless vehicles, ranging from the identification and localisation of external acoustic objects to leveraging ego-noise for motion estimation and engine fault detection. In addition to solutions already proposed in the literature, we point out directions for further investigation, focusing in particular on parallel studies in the areas of acoustics and audio signal processing that demonstrate the potential for improving the performance of driverless cars.
Original language: English
Journal: IEEE Intelligent Transportation Systems Magazine
Number of pages: 12
ISSN: 1939-1390
Publication status: Accepted/In press - 2021

Keywords

  • Acoustic Signal Processing
  • Autonomous Systems
  • Autonomous Vehicles
  • Machine Learning

