TY - GEN
T1 - Analyzing Deaf and Hard-of-Hearing Users' Behavior, Usage, and Interaction with a Personal Assistant Device that Understands Sign-Language Input
AU - Glasser, Abraham
AU - Watkins, Matthew
AU - Hart, Kira
AU - Lee, Sooyeon
AU - Huenerfauth, Matt
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/4/29
Y1 - 2022/4/29
N2 - As voice-based personal assistant technologies proliferate, e.g., smart speakers in homes, and more generally as voice-control of technology becomes increasingly ubiquitous, new accessibility barriers are emerging for many Deaf and Hard of Hearing (DHH) users. Progress in sign-language recognition may enable devices to respond to sign-language commands and potentially mitigate these barriers, but research is needed to understand how DHH users would interact with these devices and what commands they would issue. In this work, we directly engage with the DHH community, using a Wizard-of-Oz prototype that appears to understand American Sign Language (ASL) commands. Our analysis of video recordings of DHH participants revealed how they woke up the device to initiate commands, structured commands in ASL, and responded to device errors, providing guidance to future designers and researchers. We share our dataset of over 1400 commands, which may be of interest to sign-language-recognition researchers.
AB - As voice-based personal assistant technologies proliferate, e.g., smart speakers in homes, and more generally as voice-control of technology becomes increasingly ubiquitous, new accessibility barriers are emerging for many Deaf and Hard of Hearing (DHH) users. Progress in sign-language recognition may enable devices to respond to sign-language commands and potentially mitigate these barriers, but research is needed to understand how DHH users would interact with these devices and what commands they would issue. In this work, we directly engage with the DHH community, using a Wizard-of-Oz prototype that appears to understand American Sign Language (ASL) commands. Our analysis of video recordings of DHH participants revealed how they woke up the device to initiate commands, structured commands in ASL, and responded to device errors, providing guidance to future designers and researchers. We share our dataset of over 1400 commands, which may be of interest to sign-language-recognition researchers.
KW - Accessibility
KW - Deaf and Hard of Hearing
KW - Personal Assistants
KW - Sign Language
UR - http://www.scopus.com/inward/record.url?scp=85130552614&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130552614&partnerID=8YFLogxK
U2 - 10.1145/3491102.3501987
DO - 10.1145/3491102.3501987
M3 - Conference contribution
AN - SCOPUS:85130552614
T3 - Conference on Human Factors in Computing Systems - Proceedings
BT - CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
T2 - 2022 CHI Conference on Human Factors in Computing Systems, CHI 2022
Y2 - 30 April 2022 through 5 May 2022
ER -