PubMed for Handhelds
Title: Boosting random subspace method.
Authors: García-Pedrajas N, Ortiz-Boyer D.
Journal: Neural Netw. 2008 Nov;21(9):1344-62.
PubMed ID: 18272334

Abstract: In this paper we propose a boosting approach to the random subspace method (RSM) that improves performance and avoids some of RSM's major drawbacks. RSM is a successful classification method, but the random selection of inputs, the source of its success, can also be a major problem: for some problems, several of the selected subspaces lack the discriminant ability to separate the classes, and these subspaces produce poor classifiers that harm the ensemble. Boosting RSM would be a natural way to improve it; however, applying the two methods together naively achieves poor results, worse than either method alone. In this work we propose a new approach for combining RSM and boosting: instead of drawing random subspaces, we search for subspaces that optimize the weighted classification error given by the boosting algorithm, and the new classifier added to the ensemble is trained on the subspace found. An additional advantage of the proposed methodology is that it can be used with any classifier, including those, such as k-nearest-neighbor classifiers, that cannot easily use boosting methods. The proposed approach is compared with standard AdaBoost and RSM, showing improved performance on a large set of 45 problems from the UCI Machine Learning Repository. An additional study of the effect of noise in the labels of the training instances shows that the less aggressive versions of the proposed methodology are more robust than AdaBoost in the presence of noise.
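The core loop suggested by the abstract can be sketched as follows. This is not the authors' implementation: the search strategy (sampling candidate subspaces and keeping the one with the lowest weighted error), the use of scikit-learn's KNeighborsClassifier as the base learner, and the parameters n_rounds, subspace_size, and n_candidates are all illustrative assumptions; the paper optimizes the subspace more directly. Binary labels coded as -1/+1 are assumed to keep the AdaBoost update simple.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def weighted_error(clf, X, y, w, feats):
    # Weighted 0-1 error of clf, which was trained on the columns in feats.
    return float(np.sum(w * (clf.predict(X[:, feats]) != y)))

def fit_boosted_subspaces(X, y, n_rounds=10, subspace_size=5,
                          n_candidates=20, seed=0):
    # y is assumed to be coded as -1/+1 (binary case, for simplicity).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # AdaBoost instance weights
    ensemble = []                        # (alpha, feats, clf) triples
    for _ in range(n_rounds):
        best = None
        for _ in range(n_candidates):
            feats = rng.choice(d, size=subspace_size, replace=False)
            # kNN ignores instance weights when fitting; the boosting
            # distribution enters only through the subspace search.
            clf = KNeighborsClassifier(n_neighbors=3).fit(X[:, feats], y)
            err = weighted_error(clf, X, y, w, feats)
            if best is None or err < best[0]:
                best = (err, feats, clf)
        err, feats, clf = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        if err >= 0.5:                   # no better than chance: stop early
            break
        alpha = 0.5 * np.log((1 - err) / err)
        miss = clf.predict(X[:, feats]) != y
        w *= np.exp(np.where(miss, alpha, -alpha))  # standard AdaBoost update
        w /= w.sum()
        ensemble.append((alpha, feats, clf))
    return ensemble

def predict(ensemble, X):
    # Weighted vote over the ensemble members (labels coded as -1/+1).
    score = sum(a * clf.predict(X[:, f]) for a, f, clf in ensemble)
    return np.sign(score)

Note how the instance weights w never reach the kNN fit itself; they steer only the subspace search, which is presumably what lets the method wrap classifiers that cannot use boosting directly. The "less aggressive versions" mentioned in the abstract would temper the exponential reweighting (for example, by capping alpha), which is plausibly the source of their robustness to label noise.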