Abstract
Ensemble learning by manipulating the training set is an effective technique for improving classification accuracy. In this work, we investigate how to combine training-set resampling with the random subspace method in high-dimensional domains. We propose a new procedure, Bag of Little Bootstraps on Features (BLBF), which bootstraps multiple feature subsets of the original dataset drawn via the random subspace method and combines their results. Empirical experiments on a variety of high-dimensional datasets demonstrate that the proposed approach outperforms the state-of-the-art instance-based resampling algorithm BLB and two of its relevant variants in classification performance. We also examine the effect of the hyperparameters on classification performance, showing that they can be set easily while maintaining good performance.
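The abstract describes an ensemble that pairs bootstrap resampling of instances with random feature subspaces and aggregates the members' predictions. The following is a minimal illustrative sketch of that general scheme, not the authors' BLBF implementation: the base learner (a nearest-centroid classifier), the subspace fraction, and the ensemble size are all assumptions chosen for brevity.

```python
import numpy as np

def blbf_like_predict(X_train, y_train, X_test, n_estimators=25,
                      subspace_frac=0.5, seed=0):
    """Sketch of a resampling + random-subspace ensemble: each member
    trains on a bootstrap resample of the instances restricted to a
    randomly chosen feature subset; members combine by majority vote.
    The nearest-centroid base learner is a stand-in, not part of BLBF."""
    rng = np.random.default_rng(seed)
    n, d = X_train.shape
    classes = np.unique(y_train)
    k = max(1, int(subspace_frac * d))
    votes = np.empty((n_estimators, X_test.shape[0]), dtype=int)
    for m in range(n_estimators):
        feats = rng.choice(d, size=k, replace=False)  # random subspace
        rows = rng.choice(n, size=n, replace=True)    # bootstrap resample
        Xb, yb = X_train[rows][:, feats], y_train[rows]
        # Per-class centroids on the sampled view; fall back to the full
        # training set if a class is absent from this bootstrap sample.
        centroids = np.stack([
            Xb[yb == c].mean(axis=0) if (yb == c).any()
            else X_train[y_train == c][:, feats].mean(axis=0)
            for c in classes
        ])
        dists = np.linalg.norm(
            X_test[:, feats][:, None, :] - centroids[None, :, :], axis=2)
        votes[m] = classes[np.argmin(dists, axis=1)]
    # Majority vote across ensemble members for each test instance.
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

Because each member sees only a fraction of the features, the ensemble remains tractable and diverse in high-dimensional settings, which is the motivation the abstract gives for combining the two techniques.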