TY - GEN
T1 - Integrated Object, Skill, and Motion Models for Nonprehensile Manipulation
AU - Akash, Muhaiminul Islam
AU - Bhattacharya, Rituja
AU - Zurzolo, Lorenzo
AU - Qiu, Qinyin
AU - Adamovich, Sergei
AU - Wang, Cong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Advanced hand skills for object manipulation can greatly enhance the physical capability of robots in a variety of applications. Models that can comprehensively and ubiquitously capture semantic information from demonstration data are essential for robots to learn skills and act autonomously. Compared to object manipulation with firm grasping, nonprehensile manipulation skills can significantly extend the manipulation ability of robots but are also challenging to model. This paper introduces several new modeling techniques for nonprehensile object manipulation and their integration for robot learning and control. Beyond a basic map of the object's state transitions, the proposed modeling framework includes a generic object model that can help a learning agent infer manipulations that have not been demonstrated, a contact-based skill model that can semantically describe nonprehensile manipulation skills, and a motion model that can incrementally identify patterns from crowdsourced and constantly collected data. Examples and experimental results are given to explain and validate the proposed methods.
AB - Advanced hand skills for object manipulation can greatly enhance the physical capability of robots in a variety of applications. Models that can comprehensively and ubiquitously capture semantic information from demonstration data are essential for robots to learn skills and act autonomously. Compared to object manipulation with firm grasping, nonprehensile manipulation skills can significantly extend the manipulation ability of robots but are also challenging to model. This paper introduces several new modeling techniques for nonprehensile object manipulation and their integration for robot learning and control. Beyond a basic map of the object's state transitions, the proposed modeling framework includes a generic object model that can help a learning agent infer manipulations that have not been demonstrated, a contact-based skill model that can semantically describe nonprehensile manipulation skills, and a motion model that can incrementally identify patterns from crowdsourced and constantly collected data. Examples and experimental results are given to explain and validate the proposed methods.
KW - modeling for planning and control
KW - nonprehensile manipulation
KW - robot physical intelligence
UR - http://www.scopus.com/inward/record.url?scp=85203287564&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85203287564&partnerID=8YFLogxK
U2 - 10.1109/AIM55361.2024.10637220
DO - 10.1109/AIM55361.2024.10637220
M3 - Conference contribution
AN - SCOPUS:85203287564
T3 - IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM
SP - 184
EP - 191
BT - 2024 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2024
Y2 - 15 July 2024 through 19 July 2024
ER -