Effects of Extended Stochastic Gradient Descent Algorithms on Improving Latent Factor-Based Recommender Systems

Xin Luo, Mengchu Zhou

Research output: Contribution to journal › Article › peer-review

15 Scopus citations

Abstract

High-dimensional and sparse (HiDS) matrices from recommender systems contain various useful patterns. Latent factor (LF) analysis is highly efficient at grasping these patterns. Stochastic gradient descent (SGD) is a widely adopted algorithm for training an LF model. Can its extensions further improve an LF model's convergence rate and prediction accuracy for missing data? To answer this question, this work selects two representative extended SGD algorithms to propose two novel LF models. Experimental results on two HiDS matrices generated by real recommender systems show that, compared with standard SGD, extended SGD algorithms enable an LF model to achieve higher prediction accuracy for the missing data of an HiDS matrix, a faster convergence rate, and larger model diversity.
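To make the setting concrete, the sketch below trains a latent factor model on a toy sparse rating matrix with SGD. The abstract does not name the two extended SGD algorithms the paper selects, so momentum is used here purely as an illustrative extension of standard SGD; all variable names and hyperparameter values are assumptions for the example, not the paper's.

```python
import numpy as np

# Illustrative sketch, not the paper's exact algorithm: an LF model trained
# on the known entries of a sparse rating matrix, using a momentum-extended
# SGD update (setting beta = 0 recovers standard SGD).

rng = np.random.default_rng(0)

# Toy HiDS-style data: (user, item, rating) triples for known entries only.
ratings = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 0, 1.0), (2, 2, 2.0)]
n_users, n_items, k = 3, 3, 2          # k = latent dimension (assumed)
lr, reg, beta = 0.01, 0.02, 0.9        # learning rate, L2 reg, momentum

P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
vP, vQ = np.zeros_like(P), np.zeros_like(Q)   # momentum buffers

for epoch in range(500):
    for u, i, r in ratings:
        err = r - P[u] @ Q[i]                 # error on a known entry
        gP = -err * Q[i] + reg * P[u]         # gradient w.r.t. P[u]
        gQ = -err * P[u] + reg * Q[i]         # gradient w.r.t. Q[i]
        # Momentum-extended SGD step on the involved latent vectors.
        vP[u] = beta * vP[u] - lr * gP
        vQ[i] = beta * vQ[i] - lr * gQ
        P[u] += vP[u]
        Q[i] += vQ[i]

# Training RMSE over the known entries.
rmse = np.sqrt(np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in ratings]))
print(f"training RMSE: {rmse:.3f}")
```

Only the observed entries drive the updates, which is what makes LF analysis efficient on HiDS matrices; the trained factors `P[u] @ Q[i]` then serve as predictions for the missing entries.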

Original language: English (US)
Article number: 8607099
Pages (from-to): 618-624
Number of pages: 7
Journal: IEEE Robotics and Automation Letters
Volume: 4
Issue number: 2
DOIs
State: Published - Apr 2019

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Biomedical Engineering
  • Human-Computer Interaction
  • Mechanical Engineering
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence

Keywords

  • AI-based methods
  • Big data in robotics and automation
  • machine learning

