TY - GEN
T1 - A Cognitive Digital Twin for Industry 5.0 Based on a Large Language Model Agent
AU - Lou, Shanhe
AU - Tan, Runjia
AU - Zhou, Mengchu
AU - Lv, Chen
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - While digital twins have made significant strides in creating digital replicas of physical manufacturing systems, their cognitive capabilities remain inadequate to address the dynamic complexities in manufacturing, including process variations, environmental fluctuations, and human interactions. These limitations hinder their applicability in Industry 5.0, which emphasizes Self-X cognitive capabilities. This work proposes an improved five-dimensional framework for developing a cognitive digital twin (CDT) that adopts a Large Language Model (LLM) agent at its core. The LLM agent enhances domain-specific task-solving capabilities through retrieval-augmented generation (RAG) and in-context learning. RAG compensates for the general LLM's limitations by utilizing external tool libraries and industrial knowledge graphs to establish context awareness, retrieve domain-specific knowledge, and convert human commands into sequential task plans via function calls. In-context learning further enables the LLM agent to learn specific tasks based on contextual examples without retraining. It empowers the CDT to address domain-specific challenges with efficiency, flexibility, and cost-effectiveness. The effectiveness of the proposed CDT is demonstrated in a lab-scale manufacturing unit, highlighting its ability to perform valid task planning and handle dynamic incidents, paving the way for more resilient manufacturing systems aligned with Industry 5.0 objectives.
AB - While digital twins have made significant strides in creating digital replicas of physical manufacturing systems, their cognitive capabilities remain inadequate to address the dynamic complexities in manufacturing, including process variations, environmental fluctuations, and human interactions. These limitations hinder their applicability in Industry 5.0, which emphasizes Self-X cognitive capabilities. This work proposes an improved five-dimensional framework for developing a cognitive digital twin (CDT) that adopts a Large Language Model (LLM) agent at its core. The LLM agent enhances domain-specific task-solving capabilities through retrieval-augmented generation (RAG) and in-context learning. RAG compensates for the general LLM's limitations by utilizing external tool libraries and industrial knowledge graphs to establish context awareness, retrieve domain-specific knowledge, and convert human commands into sequential task plans via function calls. In-context learning further enables the LLM agent to learn specific tasks based on contextual examples without retraining. It empowers the CDT to address domain-specific challenges with efficiency, flexibility, and cost-effectiveness. The effectiveness of the proposed CDT is demonstrated in a lab-scale manufacturing unit, highlighting its ability to perform valid task planning and handle dynamic incidents, paving the way for more resilient manufacturing systems aligned with Industry 5.0 objectives.
KW - Cognitive digital twin
KW - Industrial knowledge graph
KW - Large language model agent
KW - Retrieval-augmented generation
KW - Task planning
UR - https://www.scopus.com/pages/publications/105017783259
UR - https://www.scopus.com/pages/publications/105017783259#tab=citedBy
U2 - 10.1109/ICHMS65439.2025.11154278
DO - 10.1109/ICHMS65439.2025.11154278
M3 - Conference contribution
AN - SCOPUS:105017783259
T3 - ICHMS 2025 - 5th IEEE International Conference on Human-Machine Systems: AI and Large Language Models: Transforming Human-Machine Interactions
SP - 223
EP - 228
BT - ICHMS 2025 - 5th IEEE International Conference on Human-Machine Systems
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th IEEE International Conference on Human-Machine Systems, ICHMS 2025
Y2 - 26 May 2025 through 28 May 2025
ER -