Abstract
From sirens to lane markings, the urban environment is full of sounds designed to direct a driver's attention towards events that require special care. Microphone-equipped autonomous vehicles can use these acoustic cues as well to increase safety and performance. This article explores auditory perception in the context of autonomous driving and smart vehicles in general, examining the potential of exploiting acoustic cues in driverless vehicle technology. Through a survey of the literature, we discuss various applications of auditory perception in driverless vehicles, ranging from the identification and localisation of external acoustic objects to leveraging ego-noise for motion estimation and engine fault detection. In addition to solutions already proposed in the literature, we point out directions for further investigation, focusing in particular on parallel studies in acoustics and audio signal processing that demonstrate the potential for improving the performance of driverless cars.
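As an illustration of the kind of acoustic cue discussed above (not taken from the article itself), the sketch below estimates the direction of arrival of a siren-like source from the time difference of arrival between two microphones using GCC-PHAT. The helper names (`gcc_phat`, `doa_from_tdoa`), the 20 cm microphone spacing, the 16 kHz sampling rate, and the synthetic signal are all assumptions made purely for this example.

```python
# Illustrative sketch (not from the article): two-microphone direction-of-arrival
# estimation of a siren-like source via GCC-PHAT.  Microphone spacing, sampling
# rate and the synthetic signal are assumptions made for this example only.
import numpy as np


def gcc_phat(sig, ref, fs, max_tau=None, interp=1):
    """Estimate the time difference of arrival (TDOA) between sig and ref.

    Returns tau in seconds; tau > 0 means sig is delayed relative to ref.
    """
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12                      # PHAT weighting: keep phase only
    cc = np.fft.irfft(R, n=interp * n)          # generalised cross-correlation
    max_shift = interp * n // 2
    if max_tau is not None:
        max_shift = min(int(interp * fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / float(interp * fs)


def doa_from_tdoa(tau, mic_distance, c=343.0):
    """Map a TDOA to a direction-of-arrival angle (degrees from broadside)."""
    return np.degrees(np.arcsin(np.clip(tau * c / mic_distance, -1.0, 1.0)))


if __name__ == "__main__":
    fs = 16_000
    t = np.arange(0, 0.5, 1.0 / fs)
    # Crude siren-like tone with a slow frequency wobble plus a little noise.
    siren = np.sin(2 * np.pi * (700 + 300 * np.sin(2 * np.pi * 3 * t)) * t)
    siren += 0.05 * np.random.randn(t.size)
    left = siren                                 # source is closer to the left mic
    right = np.roll(siren, 6)                    # arrives 6 samples later on the right
    mic_distance = 0.20                          # assumed 20 cm spacing
    tau = gcc_phat(right, left, fs, max_tau=mic_distance / 343.0)
    print(f"TDOA: {tau * 1e6:.0f} us, DOA: {doa_from_tdoa(tau, mic_distance):.1f} deg")
```

Real systems would run such a localiser on multi-channel arrays and combine it with acoustic event classification (e.g., siren versus horn) before feeding the result to the vehicle's planning stack; the two-microphone case above is only the simplest instance of the idea.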
| Original language | English |
|---|---|
| Journal | IEEE Intelligent Transportation Systems Magazine |
| Volume | 14 |
| Issue number | 3 |
| Pages (from-to) | 92-105 |
| ISSN | 1939-1390 |
| DOIs | |
| Publication status | Published - 2022 |
Keywords
- Acoustic Signal Processing
- Autonomous Systems
- Autonomous Vehicles
- Machine Learning