TY - GEN
T1 - GestRight
T2 - 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024
AU - Rippy, Kevin
AU - Gangopadhyay, Aryya
AU - Jayarajah, Kasthuri
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In this paper, we propose GestRight, a real-time system for gesture-based tele-operation of a mobile robot. For field use (e.g., smart factory settings, search and rescue missions), relying on tablet-based controls or joysticks is limiting, which has led to recent interest in hands-free operation of these assistive robots. In this work, we design three gesture-based schemes, namely fist, touch, and wheel, representing three levels of the precision-intuitiveness tradeoff for low-level navigational control of mobile robots. GestRight includes a head-mounted device that captures hand-joint data for accurate gesture recognition, which is then translated into motion commands at an edge server. Through a user study involving seventeen participants, we present quantitative insights in comparison to traditional modes of control. Specifically, we evaluate GestRight in terms of ease of navigational control, task time, and number of errors/corrective actions required; run extensive statistical analyses; and provide a series of design recommendations for gesture-driven teleoperation systems. Our results show that gesture-based schemes perform as well as traditional modes of control, in contrast to participants' self-reports on how successful they felt in controlling the robots.
AB - In this paper, we propose GestRight, a real-time system for gesture-based tele-operation of a mobile robot. For field use (e.g., smart factory settings, search and rescue missions), relying on tablet-based controls or joysticks is limiting, which has led to recent interest in hands-free operation of these assistive robots. In this work, we design three gesture-based schemes, namely fist, touch, and wheel, representing three levels of the precision-intuitiveness tradeoff for low-level navigational control of mobile robots. GestRight includes a head-mounted device that captures hand-joint data for accurate gesture recognition, which is then translated into motion commands at an edge server. Through a user study involving seventeen participants, we present quantitative insights in comparison to traditional modes of control. Specifically, we evaluate GestRight in terms of ease of navigational control, task time, and number of errors/corrective actions required; run extensive statistical analyses; and provide a series of design recommendations for gesture-driven teleoperation systems. Our results show that gesture-based schemes perform as well as traditional modes of control, in contrast to participants' self-reports on how successful they felt in controlling the robots.
UR - http://www.scopus.com/inward/record.url?scp=85216488119&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85216488119&partnerID=8YFLogxK
U2 - 10.1109/IROS58592.2024.10802649
DO - 10.1109/IROS58592.2024.10802649
M3 - Conference contribution
AN - SCOPUS:85216488119
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 13487
EP - 13494
BT - 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 14 October 2024 through 18 October 2024
ER -