Abstract
Hearing-impaired listeners and aided hearing-impaired listeners have been shown to have degraded auditory localization abilities in auditory-only conditions, where information from other sensory modalities is not available to the listener. However, it is unclear how auditory localization performance in such listeners is affected in more realistic, daily-life conditions, where they have access to additional cues that may aid localization, such as visual and self-motion cues. This thesis investigated how visual information affects spatial localization in normal-hearing and hearing-impaired listeners.
In the first study, a new analysis method was developed to distinguish between integration, i.e., a shift in perception, and response biases, i.e., a shift in decision making, in the spatial ventriloquist effect, a well-known phenomenon of audio-visual integration where the perceived location of an auditory stimulus is shifted towards the location of a visual stimulus. Response biases can result in an overestimation of both the shift in the perceived location of the auditory stimulus and the ‘spatial integration window’, i.e., the spatial distance in the horizontal plane between the auditory and the visual stimuli up to which they are integrated. Data from normal-hearing participants were gathered using this ventriloquist paradigm. A Gaussian clustering method was then used to cluster the localization data. These clusters were categorized into integrated, non-integrated and response-bias clusters to allow for an unbiased analysis. With this new analysis method, the results showed that the spatial integration window is asymmetric, ranging from about -12 to +28 degrees, with a negative value indicating that the visual stimulus occurred closer to the center than the auditory stimulus.
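The cluster-categorization idea described above can be sketched in a few lines of code: fit a small one-dimensional Gaussian mixture to localization responses, then label each component as integrated, non-integrated, or response bias according to its distance from the visual and auditory stimulus positions. This is only an illustrative sketch, not the thesis's actual analysis pipeline; the stimulus azimuths, the 5-degree labeling tolerance, and the simulated response counts are all assumptions made for the example.

```python
import math
import random

random.seed(1)
AUD, VIS = 0.0, 15.0  # hypothetical stimulus azimuths in degrees

# Simulated pointing responses (degrees): integrated responses near the
# visual stimulus, faithful auditory responses, and a response-bias cluster.
data = ([random.gauss(VIS, 2.0) for _ in range(60)]
        + [random.gauss(AUD, 3.0) for _ in range(30)]
        + [random.gauss(30.0, 2.0) for _ in range(10)])

def em_fit(xs, means, sds, weights, iters=50):
    """Plain expectation-maximization for a 1-D Gaussian mixture."""
    for _ in range(iters):
        # E-step: responsibility of each component for each data point.
        resp = []
        for x in xs:
            p = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                 for m, s, w in zip(means, sds, weights)]
            tot = sum(p)
            resp.append([pi / tot for pi in p])
        # M-step: re-estimate each component from its responsibilities.
        for k in range(len(means)):
            nk = sum(r[k] for r in resp)
            means[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            sds[k] = max(1e-3, math.sqrt(
                sum(r[k] * (x - means[k]) ** 2 for r, x in zip(resp, xs)) / nk))
            weights[k] = nk / len(xs)
    return means, sds, weights

means, sds, weights = em_fit(data, [-5.0, 10.0, 25.0], [5.0] * 3, [1 / 3] * 3)

def label(mean, tol=5.0):
    """Categorize a cluster mean by its proximity to the stimulus positions."""
    if abs(mean - VIS) < tol:
        return "integrated"
    if abs(mean - AUD) < tol:
        return "non-integrated"
    return "response bias"

for m, w in zip(means, weights):
    print(f"{label(m)}: mean {m:.1f} deg, {w:.2f} of responses")
```

Labeling clusters by their fitted means, rather than labeling individual trials, is what separates a genuine perceptual shift (an integrated cluster near the visual position) from a response bias (a cluster near neither stimulus).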
The second study explored the effect of stimulus realism on the spatial ventriloquist effect, by comparing the visual bias evoked with various sets of stimuli, such as a ‘non-realistic’ noise burst and a light flash vs. a ‘realistic’ bouncing ball and an impact sound. As in the first study, it was found that the relative stimulus positioning affected the probability of integration. However, no effect of stimulus realism was found, i.e., the naturalness of the stimuli did not consistently affect the results. This is important as it suggests that the results from laboratory studies using non-natural stimuli will generalize to realistic situations with natural stimuli.
Virtual reality goggles have been shown to modify spatial localization cues and affect auditory localization. The third study investigated the effect of virtual reality goggles on the perceived location of sounds that were reproduced using ambisonics with and without visual information about the position of the loudspeakers. Participants perceived sounds to be further outwards when wearing the virtual reality goggles. This effect was found to be larger in the right than in the left hemisphere and it was largest around ±52.5 degrees azimuth. When visual information was available, auditory localization was strongly biased towards the visual sources. This bias towards visual sources generally improved localization accuracy, as compared to blindfolded auditory localization, when the auditory stimulus was simulated at a loudspeaker location. However, when the auditory stimulus was simulated in between loudspeakers, participants localized the auditory sources more accurately without visual information.
The fourth study investigated spatial integration in young normal-hearing, older normal-hearing and older hearing-impaired listeners to explore how age and hearing loss affect the spatial integration window. For this, a modified version of the ventriloquist paradigm was used, employing relative instead of absolute localization. The results demonstrated that the spatial integration window was increased in older listeners. However, no difference was found between older normal-hearing and older hearing-impaired listeners.
Finally, the last study explored congruent audio-visual localization behavior and how this is affected by the number of auditory distractors. When the number of auditory distractors was low, the audio-visual area localization time, i.e., the time it took participants to get the target within their field of view, was consistent with the audio-only area localization time. However, as the number of distractors increased, visual information became more important: audio-visual area localization times were significantly shorter than in audio-only conditions. Moreover, head-motion data showed that participants modified their behavior as the number of auditory distractors increased. Audio-visual target localization times, i.e., the time it took participants to find the target when it was already within the field of view, were consistently shorter than both auditory-only and visual-only target localization times. These results show that, instead of audio-visual localization being a combination of auditory area localization and visual target localization, the auditory and visual systems contribute to both the area localization and the target localization.
Together, the experiments in this thesis demonstrate that visual information strongly influences auditory localization. The occurrence of the shift in the perceived location of auditory stimuli as a result of visual stimuli was affected by both the absolute and relative stimulus positioning as well as the participants’ age. However, realism, movement and hearing loss did not affect integration, at least when the stimuli were presented from the front direction. While auditory localization of hearing-impaired listeners was strongly biased towards visual information, the probability for this shift to occur was not higher than in normal-hearing listeners of the same age. Considering audio-visual localization behavior at larger angles, both the auditory and visual system were shown to contribute to finding the approximate area of a target and to finding the target once it was within the field of view. Overall, these results show a strong connection between the auditory and visual systems that was, at least in the front, unaffected by hearing loss. These results may guide future research on audio-visual localization in hearing-impaired and aided hearing-impaired listeners and are likely to help in the design of new hearing-aid processing algorithms or in choosing between existing algorithms.
| Original language | English |
|---|---|
| Publisher | DTU Health Technology |
| Number of pages | 221 |
| Publication status | Published - 2021 |
| Series | Contributions to Hearing Research |
| Volume | 48 |
Thesis title: 'The Influence of Vision on Spatial Localization in Normal-Hearing and Hearing-Impaired Listeners'

Projects (1 finished)
The influence of vision on spatial hearing of hearing-impaired listeners
Huisman, T. (PhD Student), Stecker, C. (Examiner), Par, S. L. J. D. E. V. D. (Examiner), Marozeau, J. P. D. (Examiner), Dau, T. (Main Supervisor), MacDonald, E. (Supervisor) & Piechowiak, T. (Supervisor)
01/02/2018 → 16/12/2021
Project: PhD