Abstract
Conversational voice assistants are often imbued with personality and human-like characteristics (e.g., gender). While researchers have begun to examine and design for the downstream societal impacts of voice assistants encoding characteristics such as gender, we know little about other human-like characteristics, such as age, that are encoded in an artificial yet anthropomorphic voice. As older adults continue to adopt voice assistants, we brought older adults into an activity to customize human-like characteristics for their voice assistant. Our findings reveal the different stereotypes and assumptions individuals associated with voice assistant characteristics (e.g., age, gender, race). We also describe individuals' motivations behind customizing or not customizing these characteristics. We discuss how biases get encoded through our design processes, marginalizing older adults and other non-dominant user groups, and we call for examination of the systemic, yet unspoken, power structures encoded in anthropomorphic technologies.
Original language | English (US)
---|---
Article number | CSCW141
Journal | Proceedings of the ACM on Human-Computer Interaction
Volume | 9
Issue number | 2
DOIs | 
State | Published - May 2, 2025
All Science Journal Classification (ASJC) codes
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications
Keywords
- age-bias
- older adults
- persona
- personality
- voice assistants