TY - GEN
T1 - A Fast Non-Linear Coupled Tensor Completion Algorithm for Financial Data Integration and Imputation
AU - Zhou, Dan
AU - Uddin, Ajim
AU - Shang, Zuofeng
AU - Sylla, Cheickna
AU - Tao, Xinyuan
AU - Yu, Dantong
N1 - Publisher Copyright:
© 2023 Owner/Author.
PY - 2023/11/27
Y1 - 2023/11/27
AB - Missing data imputation is crucial in finance to ensure accurate financial analysis, risk management, investment strategies, and other financial applications. Recently, tensor factorization and completion have gained momentum in many finance data imputation applications, primarily due to breakthroughs in applying deep neural networks to nonlinear tensor analysis. However, one limitation of these approaches is that they are prone to overfitting sparse tensors that contain only a small number of observations. This paper focuses on learning highly reliable embeddings for the tensor imputation problem and applies orthogonal regularizations for tensor factorization, reconstruction, and completion. The proposed neural network architecture for sparse tensors, called "RegTensor", includes multiple components: an embedding learning module for each tensor order, an MLP (multilayer perceptron) to model nonlinear interactions among embeddings, and a regularization module to mitigate overfitting caused by large tensor rank. Our algorithm is efficient in factorizing both single and multiple tensors (coupled tensor factorization) without incurring high training and optimization costs. We have applied this algorithm in a variety of practical scenarios, including the imputation of bond characteristics and financial analyst EPS forecast data. Experimental results demonstrate its superiority with significant performance improvements: 40%-74% better than linear tensor completion models and 2%-52% better than the state-of-the-art nonlinear models.
KW - Coupled Tensor Decomposition
KW - FinTech
KW - Non-linear Tensor Factorization
KW - Sparse Tensor Completion
UR - http://www.scopus.com/inward/record.url?scp=85179847932&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85179847932&partnerID=8YFLogxK
U2 - 10.1145/3604237.3626899
DO - 10.1145/3604237.3626899
M3 - Conference contribution
AN - SCOPUS:85179847932
T3 - ICAIF 2023 - 4th ACM International Conference on AI in Finance
SP - 409
EP - 417
BT - ICAIF 2023 - 4th ACM International Conference on AI in Finance
PB - Association for Computing Machinery, Inc
T2 - 4th ACM International Conference on AI in Finance, ICAIF 2023
Y2 - 27 November 2023 through 29 November 2023
ER -