Latent Factor-Based Recommenders Relying on Extended Stochastic Gradient Descent Algorithms

Xin Luo, Dexian Wang, Meng Chu Zhou, Huaqiang Yuan

Research output: Contribution to journal › Article › peer-review

33 Scopus citations

Abstract

High-dimensional and sparse (HiDS) matrices generated by recommender systems contain rich knowledge regarding various desired patterns, such as users' potential preferences and community tendencies. Latent factor (LF) analysis has proved highly efficient at extracting such knowledge from an HiDS matrix. Stochastic gradient descent (SGD) is an efficient algorithm for building an LF model; however, current LF models mostly adopt a standard SGD algorithm. Can SGD be extended in various ways to improve the resultant models' convergence rate and prediction accuracy for missing data? Are such SGD extensions compatible with an LF model? To answer these questions, this paper carefully investigates eight extended SGD algorithms and proposes eight novel LF models. Experimental results on two HiDS matrices generated by real recommender systems show that, compared with an LF model relying on a standard SGD algorithm, an LF model with an extended one achieves: 1) higher prediction accuracy for missing data; 2) a faster convergence rate; and 3) model diversity.
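To make the baseline concrete, the sketch below shows a latent factor model trained with a standard SGD algorithm on a sparse rating matrix stored as (user, item, rating) triples; this is a minimal illustration, not the paper's code, and the rank, learning rate, regularization, and epoch count are illustrative assumptions.

```python
import numpy as np

def train_lf_sgd(triples, n_users, n_items, rank=5, lr=0.05, reg=0.02,
                 epochs=200, seed=0):
    """Standard SGD training of a latent factor model on known entries only.

    `triples` is a list of (user, item, rating) for the observed entries of
    an otherwise sparse matrix; missing entries are simply never visited.
    All hyperparameter defaults are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=0.1, size=(n_users, rank))   # user latent factors
    Q = rng.normal(scale=0.1, size=(n_items, rank))   # item latent factors
    for _ in range(epochs):
        for u, i, r in triples:
            err = r - P[u] @ Q[i]                     # error on one observed entry
            # Standard SGD step with L2 regularization on both factor rows.
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Toy 3x3 example with several missing entries.
triples = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
           (1, 2, 1.0), (2, 1, 2.0), (2, 2, 4.0)]
P, Q = train_lf_sgd(triples, n_users=3, n_items=3)
# Training RMSE over the observed entries.
rmse = np.sqrt(np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in triples]))
# A missing entry, e.g. (0, 2), is predicted as P[0] @ Q[2].
```

The extended SGD algorithms the paper studies would change only the update step inside the inner loop (e.g., by adding momentum or adaptive per-parameter learning rates), leaving the factorization structure above unchanged.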

Original language: English (US)
Article number: 8600762
Pages (from-to): 916-926
Number of pages: 11
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Volume: 51
Issue number: 2
DOIs
State: Published - Feb 2021

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

Keywords

  • Big data
  • bi-linear
  • collaborative filtering (CF)
  • high-dimensional and sparse (HiDS) matrix
  • industry
  • latent factor (LF) analysis
  • missing data
  • recommender system

