TY - GEN
T1 - Face-Mic
T2 - 27th ACM Annual International Conference on Mobile Computing and Networking, MobiCom 2021
AU - Shi, Cong
AU - Xu, Xiangyu
AU - Zhang, Tianfang
AU - Walker, Payton
AU - Wu, Yi
AU - Liu, Jian
AU - Saxena, Nitesh
AU - Chen, Yingying
AU - Yu, Jiadi
N1 - Publisher Copyright:
© 2021 ACM.
PY - 2021
Y1 - 2021
N2 - Augmented reality/virtual reality (AR/VR) has extended beyond 3D immersive gaming to a broader array of applications, such as shopping, tourism, and education. Recently, there has also been a large shift from handheld-controller-dominated interactions to headset-dominated interactions via voice interfaces. In this work, we show a serious privacy risk of using voice interfaces while the user is wearing a face-mounted AR/VR device. Specifically, we design an eavesdropping attack, Face-Mic, which leverages speech-associated subtle facial dynamics captured by zero-permission motion sensors in AR/VR headsets to infer highly sensitive information from live human speech, including speaker gender, identity, and speech content. Face-Mic is grounded in a key insight that AR/VR headsets are closely mounted on the user's face, allowing a potentially malicious app on the headset to capture underlying facial dynamics as the wearer speaks, including movements of facial muscles and bone-borne vibrations, which encode private biometrics and speech characteristics. To mitigate the impact of body movements, we develop a signal source separation technique to identify and separate the speech-associated facial dynamics from other types of body movements. We further extract representative features with respect to the two types of facial dynamics. We successfully demonstrate the privacy leakage through AR/VR headsets by deriving the user's gender/identity and extracting speech information via a deep learning-based framework. Extensive experiments using four mainstream VR headsets validate the generalizability, effectiveness, and high accuracy of Face-Mic.
AB - Augmented reality/virtual reality (AR/VR) has extended beyond 3D immersive gaming to a broader array of applications, such as shopping, tourism, and education. Recently, there has also been a large shift from handheld-controller-dominated interactions to headset-dominated interactions via voice interfaces. In this work, we show a serious privacy risk of using voice interfaces while the user is wearing a face-mounted AR/VR device. Specifically, we design an eavesdropping attack, Face-Mic, which leverages speech-associated subtle facial dynamics captured by zero-permission motion sensors in AR/VR headsets to infer highly sensitive information from live human speech, including speaker gender, identity, and speech content. Face-Mic is grounded in a key insight that AR/VR headsets are closely mounted on the user's face, allowing a potentially malicious app on the headset to capture underlying facial dynamics as the wearer speaks, including movements of facial muscles and bone-borne vibrations, which encode private biometrics and speech characteristics. To mitigate the impact of body movements, we develop a signal source separation technique to identify and separate the speech-associated facial dynamics from other types of body movements. We further extract representative features with respect to the two types of facial dynamics. We successfully demonstrate the privacy leakage through AR/VR headsets by deriving the user's gender/identity and extracting speech information via a deep learning-based framework. Extensive experiments using four mainstream VR headsets validate the generalizability, effectiveness, and high accuracy of Face-Mic.
KW - AR/VR headsets
KW - facial dynamics
KW - speech and speaker privacy
UR - http://www.scopus.com/inward/record.url?scp=85179557229&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85179557229&partnerID=8YFLogxK
U2 - 10.1145/3447993.3483272
DO - 10.1145/3447993.3483272
M3 - Conference contribution
AN - SCOPUS:85179557229
T3 - Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM
SP - 478
EP - 490
BT - ACM MobiCom 2021 - Proceedings of the 27th ACM Annual International Conference on Mobile Computing and Networking
PB - Association for Computing Machinery
Y2 - 28 March 2022 through 1 April 2022
ER -