Mapping auditory percepts into visual interfaces for hearing impaired users

Research output: Contribution to conference › Paper – Annual report year: 2018 › Research › peer-review

Standard

Mapping auditory percepts into visual interfaces for hearing impaired users. / Johansen, Benjamin; Korzepa, Maciej Jan; Petersen, Michael Kai; Pontoppidan, Niels H.; Larsen, Jakob Eg.

2018. Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada.


Harvard

Johansen, B, Korzepa, MJ, Petersen, MK, Pontoppidan, NH & Larsen, JE 2018, 'Mapping auditory percepts into visual interfaces for hearing impaired users', Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada, 21/04/2018 - 26/04/2018.

APA

Johansen, B., Korzepa, M. J., Petersen, M. K., Pontoppidan, N. H., & Larsen, J. E. (2018). Mapping auditory percepts into visual interfaces for hearing impaired users. Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada.

CBE

Johansen B, Korzepa MJ, Petersen MK, Pontoppidan NH, Larsen JE. 2018. Mapping auditory percepts into visual interfaces for hearing impaired users. Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada.

Vancouver

Johansen B, Korzepa MJ, Petersen MK, Pontoppidan NH, Larsen JE. Mapping auditory percepts into visual interfaces for hearing impaired users. 2018. Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada.

Author

Johansen, Benjamin ; Korzepa, Maciej Jan ; Petersen, Michael Kai ; Pontoppidan, Niels H. ; Larsen, Jakob Eg. / Mapping auditory percepts into visual interfaces for hearing impaired users. Paper presented at 2018 Conference on Human Factors in Computing Systems, Montréal, Canada. 6 p.

Bibtex

@conference{21353f901a5e48379146c629899c6f89,
title = "Mapping auditory percepts into visual interfaces for hearing impaired users",
abstract = "Auditory-visual interfaces for hearing aid users have received limited attention in HCI research. We explore how to personalize audiological parameters by transforming auditory percepts into visual interfaces. In a pilot study (N = 10) we investigate the interaction patterns of smartphone-connected hearing aids. We sketch out a visual interface based on two audiological parameters, brightness and directionality. We discuss how text labels and contrasting colors help users navigate in an auditory interface, and how exploring an auditory interface may enhance the user experience of hearing aids. This study indicates that contextual preferences seemingly reflect cognitive differences in auditory processing. Based on the findings we propose four items to be considered when designing auditory interfaces: 1) using a map to visualize audiological parameters, 2) applying visual metaphors, turning auditory preferences into actionable interface parameters, 3) supporting user navigation by using visual markers, 4) capturing user intents when learning contextual preferences.",
author = "Benjamin Johansen and Korzepa, {Maciej Jan} and Petersen, {Michael Kai} and Pontoppidan, {Niels H.} and Larsen, {Jakob Eg}",
year = "2018",
language = "English",
note = "2018 Conference on Human Factors in Computing Systems, CHI’2018; Conference date: 21-04-2018 Through 26-04-2018",
url = "https://chi2018.acm.org",

}

RIS

TY - CONF

T1 - Mapping auditory percepts into visual interfaces for hearing impaired users

AU - Johansen, Benjamin

AU - Korzepa, Maciej Jan

AU - Petersen, Michael Kai

AU - Pontoppidan, Niels H.

AU - Larsen, Jakob Eg

PY - 2018

Y1 - 2018

N2 - Auditory-visual interfaces for hearing aid users have received limited attention in HCI research. We explore how to personalize audiological parameters by transforming auditory percepts into visual interfaces. In a pilot study (N = 10) we investigate the interaction patterns of smartphone-connected hearing aids. We sketch out a visual interface based on two audiological parameters, brightness and directionality. We discuss how text labels and contrasting colors help users navigate in an auditory interface, and how exploring an auditory interface may enhance the user experience of hearing aids. This study indicates that contextual preferences seemingly reflect cognitive differences in auditory processing. Based on the findings we propose four items to be considered when designing auditory interfaces: 1) using a map to visualize audiological parameters, 2) applying visual metaphors, turning auditory preferences into actionable interface parameters, 3) supporting user navigation by using visual markers, 4) capturing user intents when learning contextual preferences.

AB - Auditory-visual interfaces for hearing aid users have received limited attention in HCI research. We explore how to personalize audiological parameters by transforming auditory percepts into visual interfaces. In a pilot study (N = 10) we investigate the interaction patterns of smartphone-connected hearing aids. We sketch out a visual interface based on two audiological parameters, brightness and directionality. We discuss how text labels and contrasting colors help users navigate in an auditory interface, and how exploring an auditory interface may enhance the user experience of hearing aids. This study indicates that contextual preferences seemingly reflect cognitive differences in auditory processing. Based on the findings we propose four items to be considered when designing auditory interfaces: 1) using a map to visualize audiological parameters, 2) applying visual metaphors, turning auditory preferences into actionable interface parameters, 3) supporting user navigation by using visual markers, 4) capturing user intents when learning contextual preferences.

M3 - Paper

ER -