A personalized recommendation algorithm based on Hadoop

Hao Huang, Jianqing Huang, Sotirios Ziavras, Yaojie Lu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citations

Abstract

The BDM-NBI algorithm is proposed in this paper. It is a personalized recommendation algorithm based on a weighted bipartite graph and suited to processing big data. The algorithm partitions the bipartite graph with a vertex-separator method, which transforms a high-dimensional sparse matrix into a pseudo-block diagonal matrix. The recommendation algorithm then analyzes all weighted sub-matrices in parallel, and the global recommendation weight matrix is produced by merging the sub-matrices in parallel. Experiments on Hadoop show that the algorithm approximates the exact result well for small matrices and scales excellently.
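The idea sketched in the abstract can be illustrated with a minimal example. The sketch below is not the authors' BDM-NBI or Hadoop implementation; it is a standard network-based inference (NBI) recommendation on a user-item bipartite matrix, with a simple row-block split of the weight matrix standing in for the paper's pseudo-block diagonal decomposition and parallel merge. All function names and the partitioning strategy are illustrative assumptions.

```python
import numpy as np

def nbi_weight_matrix(A):
    """Item-item resource-allocation matrix W for a 0/1 matrix A (users x items).

    w[i, j] = (1 / k_j) * sum_l A[l, i] * A[l, j] / k_l,
    where k_j is item j's degree and k_l is user l's degree.
    """
    user_deg = A.sum(axis=1)  # k_l
    item_deg = A.sum(axis=0)  # k_j
    # Guard against division by zero for isolated users/items.
    inv_user = np.divide(1.0, user_deg, out=np.zeros(A.shape[0]), where=user_deg > 0)
    inv_item = np.divide(1.0, item_deg, out=np.zeros(A.shape[1]), where=item_deg > 0)
    # (A.T * inv_user) scales each user column by 1/k_l; the trailing
    # * inv_item scales each item column by 1/k_j.
    return (A.T * inv_user) @ A * inv_item

def recommend(A, user, n_blocks=2):
    """Score unseen items for `user`, computing the scores block-by-block
    over row partitions of W and merging the partial results."""
    W = nbi_weight_matrix(A)
    a_u = A[user]
    scores = np.zeros(A.shape[1])
    # Each block could be handled by a separate worker; here the
    # "parallel" sub-matrix passes run sequentially for clarity.
    for block in np.array_split(np.arange(A.shape[1]), n_blocks):
        scores[block] = W[block] @ a_u
    scores[a_u > 0] = -np.inf  # mask items the user already has
    return scores
```

For instance, with three users and three items where user 0 holds items 0 and 1, the only finite score is for the unseen item 2; block-wise evaluation gives the same result as computing `W @ a_u` in one pass, which is the property the merge step relies on.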

Original language: English (US)
Title of host publication: ICEIEC 2015 - Proceedings of 2015 IEEE 5th International Conference on Electronics Information and Emergency Communication
Editors: Vincent Tam, Zhu Wei, Li Wenzheng
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 406-409
Number of pages: 4
ISBN (Electronic): 9781479972838
DOIs
State: Published - Sep 29 2015
Event: 5th IEEE International Conference on Electronics Information and Emergency Communication, ICEIEC 2015 - Beijing, China
Duration: May 14 2015 - May 16 2015

Publication series

Name: ICEIEC 2015 - Proceedings of 2015 IEEE 5th International Conference on Electronics Information and Emergency Communication

Other

Other: 5th IEEE International Conference on Electronics Information and Emergency Communication, ICEIEC 2015
Country: China
City: Beijing
Period: 5/14/15 - 5/16/15

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Communication
  • Computer Networks and Communications

Keywords

  • Big Data
  • Parallelization
  • Personalized Recommendation
  • Sparse Matrix Partition
