TY - JOUR
T1 - Adjusting Learning Depth in Nonnegative Latent Factorization of Tensors for Accurately Modeling Temporal Patterns in Dynamic QoS Data
AU - Luo, Xin
AU - Chen, Minzhi
AU - Wu, Hao
AU - Liu, Zhigang
AU - Yuan, Huaqiang
AU - Zhou, Mengchu
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grant 61772493, in part by the Guangdong Province Universities and College Pearl River Scholar Funded Scheme (2019), and in part by the Natural Science Foundation of Chongqing (China) under Grant cstc2019jcyjjqX0013.
Publisher Copyright:
© 2021 Institute of Electrical and Electronics Engineers Inc. All rights reserved.
PY - 2021/10/1
Y1 - 2021/10/1
AB - A nonnegative latent factorization of tensors (NLFT) model precisely represents the temporal patterns hidden in multichannel data emerging from various applications. It often adopts a single latent factor-dependent, nonnegative and multiplicative update on tensor (SLF-NMUT) algorithm. However, the learning depth of this algorithm is not adjustable, resulting in frequent training fluctuations or poor model convergence caused by overshooting. To address this issue, this study carefully investigates the connections between the performance of an NLFT model and its learning depth via SLF-NMUT, and presents a joint learning-depth-adjusting scheme. Based on this scheme, a depth-adjusted multiplicative update on tensor algorithm is proposed, yielding a novel depth-adjusted nonnegative latent-factorization-of-tensors (DNL) model. Empirical studies on two industrial data sets demonstrate that, compared with state-of-the-art NLFT models, a DNL model achieves significant accuracy gains with high efficiency when estimating missing data in a high-dimensional and incomplete tensor. Note to Practitioners - Multichannel data are often encountered in various big-data-related applications. It is vital for a data analyzer to correctly capture the temporal patterns hidden in such data for efficient knowledge acquisition and representation. This article focuses on analyzing temporal QoS data, a representative kind of multichannel data. To correctly extract their temporal patterns, an analyzer should correctly describe their nonnegativity. This can be achieved by building a nonnegative latent factorization of tensors (NLFT) model relying on a single latent factor-dependent, nonnegative and multiplicative update on tensor (SLF-NMUT) algorithm. However, its learning depth is not adjustable, so an NLFT model frequently suffers from severe fluctuations in its training error or even fails to converge. To address this issue, this study carefully investigates the learning rules for an NLFT model's decision parameters under an SLF-NMUT and proposes a joint learning-depth-adjusting scheme. This scheme manipulates the multiplicative terms in SLF-NMUT-based learning rules linearly and exponentially, thereby making the learning depth adjustable. Based on it, this study builds a novel depth-adjusted nonnegative latent-factorization-of-tensors (DNL) model. Compared with existing NLFT models, a DNL model better represents multichannel data. It meets industrial needs well and can be used to achieve high performance in data analysis tasks such as temporal-aware missing data estimation.
KW - algorithm
KW - big data
KW - dynamics
KW - high-dimensional and incomplete (HDI) data
KW - machine learning
KW - missing data estimation
KW - multichannel data
KW - nonnegative latent factorization of tensors (NLFT)
KW - quality of service (QoS)
KW - temporal pattern
KW - web service
UR - http://www.scopus.com/inward/record.url?scp=85099597604&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85099597604&partnerID=8YFLogxK
U2 - 10.1109/TASE.2020.3040400
DO - 10.1109/TASE.2020.3040400
M3 - Article
AN - SCOPUS:85099597604
SN - 1545-5955
VL - 18
SP - 2142
EP - 2155
JO - IEEE Transactions on Automation Science and Engineering
JF - IEEE Transactions on Automation Science and Engineering
IS - 4
ER -