TY - GEN
T1 - Core Matrix Regression and Prediction with Regularization
AU - Zhou, Dan
AU - Uddin, Ajim
AU - Shang, Zuofeng
AU - Sylla, Cheickna
AU - Yu, Dantong
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/11/2
Y1 - 2022/11/2
N2 - Many financial time-series analyses track a matrix of variables at each time point and study their co-evolution over a long horizon. Such matrix time series are highly sparse, involve complex interactions among latent matrix factors, and demand advanced models to extract dynamic temporal patterns from these interactions. This paper proposes a Core Matrix Regression with Regularization (CMRR) algorithm to capture spatiotemporal relations in sparse matrix-variate time series. The model decomposes each matrix into three factor matrices representing row entities, column entities, and the interactions between row and column entities, respectively. Subsequently, it applies recurrent neural networks to the interaction matrices to extract temporal patterns. Given the sparsity of the matrices, we design an element-wise orthogonal matrix factorization that leverages Stochastic Gradient Descent (SGD) on a deep learning platform to overcome the challenges of sparse, large volumes of complex data. Experiments confirm that combining orthogonal matrix factorization with recurrent neural networks is highly effective and outperforms existing graph neural networks and tensor-based time series prediction methods. We apply CMRR to three real-world financial applications: forecasting firm earnings, firm fundamentals, and firm characteristics, and demonstrate its consistent superiority, reducing error by 23%-53% relative to other state-of-the-art high-dimensional time series prediction algorithms.
AB - Many financial time-series analyses track a matrix of variables at each time point and study their co-evolution over a long horizon. Such matrix time series are highly sparse, involve complex interactions among latent matrix factors, and demand advanced models to extract dynamic temporal patterns from these interactions. This paper proposes a Core Matrix Regression with Regularization (CMRR) algorithm to capture spatiotemporal relations in sparse matrix-variate time series. The model decomposes each matrix into three factor matrices representing row entities, column entities, and the interactions between row and column entities, respectively. Subsequently, it applies recurrent neural networks to the interaction matrices to extract temporal patterns. Given the sparsity of the matrices, we design an element-wise orthogonal matrix factorization that leverages Stochastic Gradient Descent (SGD) on a deep learning platform to overcome the challenges of sparse, large volumes of complex data. Experiments confirm that combining orthogonal matrix factorization with recurrent neural networks is highly effective and outperforms existing graph neural networks and tensor-based time series prediction methods. We apply CMRR to three real-world financial applications: forecasting firm earnings, firm fundamentals, and firm characteristics, and demonstrate its consistent superiority, reducing error by 23%-53% relative to other state-of-the-art high-dimensional time series prediction algorithms.
KW - Tensor algorithm
KW - matrix factorization
KW - matrix-variate time series prediction
KW - recurrent neural networks
UR - http://www.scopus.com/inward/record.url?scp=85142479188&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85142479188&partnerID=8YFLogxK
U2 - 10.1145/3533271.3561709
DO - 10.1145/3533271.3561709
M3 - Conference contribution
AN - SCOPUS:85142479188
T3 - Proceedings of the 3rd ACM International Conference on AI in Finance, ICAIF 2022
SP - 291
EP - 299
BT - Proceedings of the 3rd ACM International Conference on AI in Finance, ICAIF 2022
PB - Association for Computing Machinery, Inc
T2 - 3rd ACM International Conference on AI in Finance, ICAIF 2022
Y2 - 2 November 2022 through 4 November 2022
ER -