TY - GEN
T1 - Investigating microinteractions for people with visual impairments and the potential role of on-body interaction
AU - Oh, Uran
AU - Stearns, Lee
AU - Pradhan, Alisha
AU - Froehlich, Jon E.
AU - Findlater, Leah
N1 - Publisher Copyright:
© 2017 Association for Computing Machinery.
PY - 2017/10/19
Y1 - 2017/10/19
N2 - For screenreader users who are blind or visually impaired (VI), today's mobile devices, while reasonably accessible, are not necessarily efficient. This inefficiency may be especially problematic for microinteractions, which are brief but high-frequency interactions that take only a few seconds for sighted users to complete (e.g., checking the weather or for new messages). One potential solution to support efficient non-visual microinteractions is on-body input, which appropriates the user's own body as the interaction medium. In this paper, we address two related research questions: How well are microinteractions currently supported for VI users? How should on-body interaction be designed to best support microinteractions for this user group? We conducted two studies: (1) an online survey to compare current microinteraction use between VI and sighted users (N=117); and (2) an in-person study where 12 VI screenreader users qualitatively evaluated a real-time on-body interaction system that provided three contrasting input designs. Our findings suggest that efficient microinteractions are not currently well-supported for VI users, at least using manual input, which highlights the need for new interaction approaches. On-body input offers this potential and the qualitative evaluation revealed tradeoffs with different on-body interaction techniques in terms of perceived efficiency, learnability, social acceptability, and ability to use on the go.
AB - For screenreader users who are blind or visually impaired (VI), today's mobile devices, while reasonably accessible, are not necessarily efficient. This inefficiency may be especially problematic for microinteractions, which are brief but high-frequency interactions that take only a few seconds for sighted users to complete (e.g., checking the weather or for new messages). One potential solution to support efficient non-visual microinteractions is on-body input, which appropriates the user's own body as the interaction medium. In this paper, we address two related research questions: How well are microinteractions currently supported for VI users? How should on-body interaction be designed to best support microinteractions for this user group? We conducted two studies: (1) an online survey to compare current microinteraction use between VI and sighted users (N=117); and (2) an in-person study where 12 VI screenreader users qualitatively evaluated a real-time on-body interaction system that provided three contrasting input designs. Our findings suggest that efficient microinteractions are not currently well-supported for VI users, at least using manual input, which highlights the need for new interaction approaches. On-body input offers this potential and the qualitative evaluation revealed tradeoffs with different on-body interaction techniques in terms of perceived efficiency, learnability, social acceptability, and ability to use on the go.
KW - Microinteraction
KW - Mobile
KW - On-body interaction
KW - Visual impairments
KW - Wearable technology
UR - http://www.scopus.com/inward/record.url?scp=85041400903&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85041400903&partnerID=8YFLogxK
U2 - 10.1145/3132525.3132536
DO - 10.1145/3132525.3132536
M3 - Conference contribution
AN - SCOPUS:85041400903
T3 - ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
SP - 22
EP - 31
BT - ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
PB - Association for Computing Machinery, Inc
T2 - 19th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2017
Y2 - 29 October 2017 through 1 November 2017
ER -