Feature Selection Inspired Classifier Ensemble Reduction

Ren Diao, Fei Chao, Taoxin Peng, Neal Snooke, Qiang Shen

Research output: Contribution to journal › Article › peer-review

80 Citations (Scopus)
394 Downloads (Pure)

Abstract

Classifier ensembles constitute one of the main research directions in machine learning and data mining. The use of multiple classifiers generally allows better predictive performance than that achievable with a single model. Several approaches exist in the literature that provide means to construct and aggregate such ensembles. However, these ensemble systems may contain redundant members that, if removed, can further increase group diversity and produce better results. Smaller ensembles also relax memory and storage requirements, reducing the system's run-time overhead while improving overall efficiency. This paper extends ideas developed for feature selection problems to support classifier ensemble reduction, by transforming ensemble predictions into training samples and treating classifiers as features. The global heuristic harmony search is then used to select a reduced subset of such artificial features, while attempting to maximize the feature subset evaluation. The resulting technique is systematically evaluated using high-dimensional and large benchmark datasets, showing superior classification performance compared with both the original, unreduced ensembles and randomly formed subsets.
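The sketch below illustrates the core idea the abstract describes: each ensemble member's predictions become a column of an artificial feature matrix, and a simplified harmony search picks a classifier subset that maximizes a subset-evaluation score (here, majority-vote accuracy on a held-out validation split). The ensemble construction, the evaluation criterion, and the harmony-search parameters (hms, hmcr, par) are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch: classifier ensemble reduction via feature-selection-style
# harmony search over binary inclusion masks. Assumptions: bagged decision
# trees as the ensemble, majority-vote validation accuracy as the subset
# evaluation, binary class labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Build a bagged ensemble of decision trees.
n_members = 25
members = []
for i in range(n_members):
    idx = rng.integers(0, len(X_tr), len(X_tr))           # bootstrap sample
    members.append(DecisionTreeClassifier(random_state=i).fit(X_tr[idx], y_tr[idx]))

# "Feature" matrix: column j holds classifier j's predictions on the validation set.
P = np.column_stack([m.predict(X_val) for m in members])

def evaluate(mask):
    """Majority-vote accuracy of the selected classifier subset."""
    if not mask.any():
        return 0.0
    majority = (P[:, mask].mean(axis=1) >= 0.5).astype(int)   # binary labels assumed
    return float((majority == y_val).mean())

# Simplified harmony search over binary inclusion masks.
hms, hmcr, par, iterations = 10, 0.9, 0.3, 200
memory = [rng.random(n_members) < 0.5 for _ in range(hms)]
scores = [evaluate(m) for m in memory]

for _ in range(iterations):
    new = np.empty(n_members, dtype=bool)
    for j in range(n_members):
        if rng.random() < hmcr:                            # draw from harmony memory
            new[j] = memory[rng.integers(hms)][j]
            if rng.random() < par:                         # pitch adjustment: flip bit
                new[j] = not new[j]
        else:                                              # random consideration
            new[j] = rng.random() < 0.5
    s = evaluate(new)
    worst = int(np.argmin(scores))
    if s > scores[worst]:                                  # replace worst harmony
        memory[worst], scores[worst] = new, s

best = memory[int(np.argmax(scores))]
print(f"selected {best.sum()} of {n_members} classifiers, "
      f"subset accuracy = {max(scores):.3f}, "
      f"full ensemble = {evaluate(np.ones(n_members, bool)):.3f}")
```

In this toy setting the reduced subset typically matches or exceeds the full ensemble's vote accuracy while using far fewer members, which is the trade-off the paper targets; the actual work evaluates subsets with a feature-selection criterion rather than raw validation accuracy.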
Original language: English
Pages (from-to): 1259-1268
Journal: IEEE Transactions on Cybernetics
Volume: 44
Issue number: 8
DOIs
Publication status: Published - 15 Jul 2014
