TY - GEN
T1 - Human Preferred Augmented Reality Visual Cues for Remote Robot Manipulation Assistance
T2 - 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023
AU - Krishnan, Achyuthan Unni
AU - Lin, Tsung Chi
AU - Li, Zhi
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - When humans control or supervise remote robot manipulation, augmented reality (AR) visual cues overlaid on the remote camera video stream can effectively enhance humans' remote perception of task and robot states, and their comprehension of the robot autonomy's capability and intent. In this work, we conducted a user study (N=18) to investigate: (RQ1) what AR cues humans prefer when controlling the robot with various levels of autonomy, and (RQ2) whether this preference can be influenced by the way humans learn to use the interface. We provided AR visual cues of various types (e.g., motion guidance, obstacle indicator, target hint, autonomy activation and intent) to assist humans in picking and placing an object around an obstacle on a counter workspace. We found that: 1) participants prefer different types of AR cues depending on the level of robot autonomy; 2) the AR cues participants prefer to use after hands-on robot operation converged to the recommendations of experienced users, and may differ substantially from their initial selection based on video instruction.
AB - When humans control or supervise remote robot manipulation, augmented reality (AR) visual cues overlaid on the remote camera video stream can effectively enhance humans' remote perception of task and robot states, and their comprehension of the robot autonomy's capability and intent. In this work, we conducted a user study (N=18) to investigate: (RQ1) what AR cues humans prefer when controlling the robot with various levels of autonomy, and (RQ2) whether this preference can be influenced by the way humans learn to use the interface. We provided AR visual cues of various types (e.g., motion guidance, obstacle indicator, target hint, autonomy activation and intent) to assist humans in picking and placing an object around an obstacle on a counter workspace. We found that: 1) participants prefer different types of AR cues depending on the level of robot autonomy; 2) the AR cues participants prefer to use after hands-on robot operation converged to the recommendations of experienced users, and may differ substantially from their initial selection based on video instruction.
UR - https://www.scopus.com/pages/publications/85182524631
UR - https://www.scopus.com/pages/publications/85182524631#tab=citedBy
U2 - 10.1109/IROS55552.2023.10341969
DO - 10.1109/IROS55552.2023.10341969
M3 - Conference contribution
AN - SCOPUS:85182524631
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 7034
EP - 7039
BT - 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 1 October 2023 through 5 October 2023
ER -