TY - JOUR
T1 - Machine Learning in Real-Time Internet of Things (IoT) Systems
T2 - A Survey
AU - Bian, Jiang
AU - Arafat, Abdullah Al
AU - Xiong, Haoyi
AU - Li, Jing
AU - Li, Li
AU - Chen, Hongyang
AU - Wang, Jun
AU - Dou, Dejing
AU - Guo, Zhishan
N1 - Funding Information:
This work was supported in part by the National Science Foundation, USA, under Grant CNS 1948457, Grant CNS 1850851, Grant PPoSS-2028481, and Grant OIA 1937833. The work of Jiang Bian and Haoyi Xiong was supported in part by the National Key Research and Development Program of China under Grant 2021ZD0110303.
Publisher Copyright:
© 2014 IEEE.
PY - 2022/6/1
Y1 - 2022/6/1
N2 - Over the last decade, machine learning (ML) and deep learning (DL) algorithms have evolved significantly and been employed in diverse applications, such as computer vision, natural language processing, and automated speech recognition. Real-time safety-critical embedded and Internet of Things (IoT) systems, such as autonomous driving systems, UAVs, drones, and security robots, rely heavily on ML/DL-based technologies, accelerated by improvements in hardware. In these safety-critical systems, the cost of a deadline (a required time constraint) missed by an ML/DL algorithm would be catastrophic. However, ML/DL-based applications have traditionally prioritized accuracy over strict timing requirements. Accordingly, researchers from the real-time systems (RTSs) community address the strict timing requirements of ML/DL technologies for inclusion in RTSs. This article rigorously explores state-of-the-art results, emphasizing the strengths and weaknesses of ML/DL-based scheduling techniques, accuracy-versus-execution-time tradeoff policies of ML algorithms, and the security and privacy of learning-based algorithms in real-time IoT systems.
AB - Over the last decade, machine learning (ML) and deep learning (DL) algorithms have evolved significantly and been employed in diverse applications, such as computer vision, natural language processing, and automated speech recognition. Real-time safety-critical embedded and Internet of Things (IoT) systems, such as autonomous driving systems, UAVs, drones, and security robots, rely heavily on ML/DL-based technologies, accelerated by improvements in hardware. In these safety-critical systems, the cost of a deadline (a required time constraint) missed by an ML/DL algorithm would be catastrophic. However, ML/DL-based applications have traditionally prioritized accuracy over strict timing requirements. Accordingly, researchers from the real-time systems (RTSs) community address the strict timing requirements of ML/DL technologies for inclusion in RTSs. This article rigorously explores state-of-the-art results, emphasizing the strengths and weaknesses of ML/DL-based scheduling techniques, accuracy-versus-execution-time tradeoff policies of ML algorithms, and the security and privacy of learning-based algorithms in real-time IoT systems.
KW - Deep learning (DL)
KW - Internet of Things (IoT)
KW - Machine learning (ML)
KW - Real-time systems (RTSs)
KW - Scheduling
UR - http://www.scopus.com/inward/record.url?scp=85127082088&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85127082088&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2022.3161050
DO - 10.1109/JIOT.2022.3161050
M3 - Review article
AN - SCOPUS:85127082088
SN - 2327-4662
VL - 9
SP - 8364
EP - 8386
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 11
ER -