TY - GEN
T1 - Unconstrained Non-negative Factorization of High-dimensional and Sparse Matrices in Recommender Systems
AU - Luo, Xin
AU - Zhou, Mengchu
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/12/4
Y1 - 2018/12/4
N2 - Non-negativity is vital for a latent factor model to preserve an important feature of most high-dimensional and sparse (HiDS) matrices, i.e., none of their entries is negative. To incorporate non-negativity, the training process of a latent factor model must adopt constraints-incorporated learning schemes. However, such schemes are neither flexible nor extensible. This work investigates algorithms of inherently non-negative latent factor analysis, which separate non-negativity constraints from the training process. Based on a deep investigation into the learning objective of a non-negative latent factor model, we separate the desired latent factors from the decision variables involved in training via a single-element-dependent mapping function that makes the output factors inherently non-negative. We then theoretically prove that the resultant model can effectively represent the original one. Accordingly, we design a highly efficient algorithm to bring the Inherently Non-negative Latent Factor model into practice. Experimental results on three HiDS matrices from industrial recommender systems show that, compared with state-of-the-art non-negative latent factor models, the proposed one achieves higher prediction accuracy with comparable or higher computational efficiency. Moreover, this performance is achieved through an unconstrained optimization process while fulfilling the non-negativity constraints. Hence, the proposed model is highly valuable for industrial applications that must handle HiDS matrices subject to non-negativity constraints.
AB - Non-negativity is vital for a latent factor model to preserve an important feature of most high-dimensional and sparse (HiDS) matrices, i.e., none of their entries is negative. To incorporate non-negativity, the training process of a latent factor model must adopt constraints-incorporated learning schemes. However, such schemes are neither flexible nor extensible. This work investigates algorithms of inherently non-negative latent factor analysis, which separate non-negativity constraints from the training process. Based on a deep investigation into the learning objective of a non-negative latent factor model, we separate the desired latent factors from the decision variables involved in training via a single-element-dependent mapping function that makes the output factors inherently non-negative. We then theoretically prove that the resultant model can effectively represent the original one. Accordingly, we design a highly efficient algorithm to bring the Inherently Non-negative Latent Factor model into practice. Experimental results on three HiDS matrices from industrial recommender systems show that, compared with state-of-the-art non-negative latent factor models, the proposed one achieves higher prediction accuracy with comparable or higher computational efficiency. Moreover, this performance is achieved through an unconstrained optimization process while fulfilling the non-negativity constraints. Hence, the proposed model is highly valuable for industrial applications that must handle HiDS matrices subject to non-negativity constraints.
UR - http://www.scopus.com/inward/record.url?scp=85059973531&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85059973531&partnerID=8YFLogxK
U2 - 10.1109/COASE.2018.8560481
DO - 10.1109/COASE.2018.8560481
M3 - Conference contribution
AN - SCOPUS:85059973531
T3 - IEEE International Conference on Automation Science and Engineering
SP - 1406
EP - 1413
BT - 2018 IEEE 14th International Conference on Automation Science and Engineering, CASE 2018
PB - IEEE Computer Society
T2 - 14th IEEE International Conference on Automation Science and Engineering, CASE 2018
Y2 - 20 August 2018 through 24 August 2018
ER -