Top-k parametrized boost

Turki Turki, Muhammad Ihsan, Nouf Turki, Jie Zhang, Usman Roshan, Zhi Wei

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Ensemble methods such as AdaBoost are popular machine learning methods that create a highly accurate classifier by combining the predictions from several classifiers. We present a parametrized variant of AdaBoost that we call Top-k Parametrized Boost. We evaluate our method and other popular ensemble methods from a classification perspective on several real datasets. Our empirical study shows that our method attains the lowest average error on these datasets, with statistical significance.
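
This page does not describe the algorithm itself, so the sketch below is only a rough illustration of the general idea, not the authors' method: it trains a standard discrete AdaBoost ensemble of decision stumps and then lets only the k weak learners with the largest vote weights (alpha) contribute at prediction time. The top-k selection rule, the function names, and the parameter k are assumptions made for illustration.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def fit_adaboost(X, y, n_rounds=50):
        # Standard discrete AdaBoost with decision stumps; labels y must be in {-1, +1}.
        y = np.asarray(y)
        n = len(y)
        w = np.full(n, 1.0 / n)                    # start with uniform example weights
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)  # weighted training error
            alpha = 0.5 * np.log((1 - err) / err)  # vote weight of this weak learner
            w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, np.array(alphas)

    def predict_top_k(stumps, alphas, X, k):
        # Hypothetical "top-k" vote: only the k weak learners with the largest
        # alpha take part in the final weighted-majority prediction.
        top = np.argsort(alphas)[-k:]
        score = sum(alphas[i] * stumps[i].predict(X) for i in top)
        return np.sign(score)

For example, one might fit 100 boosting rounds and then predict with only the 25 heaviest stumps: stumps, alphas = fit_adaboost(X_train, y_train, 100) followed by predict_top_k(stumps, alphas, X_test, 25).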

Original language: English (US)
Title of host publication: Mining Intelligence and Knowledge Exploration - 2nd International Conference, MIKE 2014, Proceedings
Editors: Rajendra Prasath, Philip O’Reilly, Thangairulappan Kathirvalavakumar
Publisher: Springer Verlag
Pages: 91-98
Number of pages: 8
ISBN (Electronic): 9783319138169
DOIs
State: Published - 2014
Event: 2nd International Conference on Mining Intelligence and Knowledge Exploration, MIKE 2014 - Cork, Ireland
Duration: Dec 10 2014 - Dec 12 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8891
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 2nd International Conference on Mining Intelligence and Knowledge Exploration, MIKE 2014
Country/Territory: Ireland
City: Cork
Period: 12/10/14 - 12/12/14

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science

Keywords

  • AdaBoost
  • Ensemble methods
  • Statistical significance
