
Feature-subspace aggregating: ensembles for stable and unstable learners

Bibliographic Details
Published in: Machine Learning, 2011-03, Vol. 82 (3), p. 375-397
Main Authors: Ting, Kai Ming; Wells, Jonathan R.; Tan, Swee Chuan; Teng, Shyh Wei; Webb, Geoffrey I.
Format: Article
Language: English
Subjects: Bagging; Construction; Decision trees; Empirical analysis; Forests; Machine learning; Performance enhancement; Support vector machines
DOI: 10.1007/s10994-010-5224-5
ISSN: 0885-6125
EISSN: 1573-0565
Online Access: https://doi.org/10.1007/s10994-010-5224-5

Description: This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners. In contrast, most existing ensemble approaches can improve the predictive performance of unstable learners only. Our analysis shows that the new approach reduces the execution time to generate a model in an ensemble through an increased level of localisation in Feating. Our empirical evaluation shows that Feating performs significantly better than Boosting, Random Subspace and Bagging in terms of predictive accuracy when a stable learner, SVM, is used as the base learner. The speed-up achieved by Feating makes SVM ensembles feasible for large data sets where they would otherwise be infeasible. When SVM is the preferred base learner, we show that Feating SVM performs better than Boosting decision trees and Random Forests. We further demonstrate that Feating also substantially reduces the error of another stable learner, k-nearest neighbour, and an unstable learner, decision tree.
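
As a rough intuition for the abstract's "local models instead of global models" idea, the sketch below shows one way such an ensemble could be realised: every small subset of features induces a partition of the instance space, a separate base learner (here an SVM) is trained only on the instances falling in each region, and predictions are aggregated by majority vote across all partitions. This is a minimal illustration, not the authors' exact construction; the class name FeatingSketch, the quantile binning, and the abstention/fallback rules are assumptions made for this demo.

```python
# Minimal sketch of the feature-subspace-aggregating idea, assuming
# numeric features and quantile binning (demo assumptions, not the paper).
from itertools import combinations
from collections import Counter, defaultdict

import numpy as np
from sklearn.svm import SVC


class FeatingSketch:
    """Toy ensemble of local SVMs, one per feature-subspace region."""

    def __init__(self, h=2, n_bins=3):
        self.h = h            # number of features that define each subspace
        self.n_bins = n_bins  # coarse discretisation of each numeric feature

    def _region(self, x, feats):
        # A region is identified by the bin index of each chosen feature.
        return tuple(int(np.digitize(x[f], self.edges_[f])) for f in feats)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Internal quantile cut points per feature (a demo assumption).
        qs = np.linspace(0, 1, self.n_bins + 1)[1:-1]
        self.edges_ = {f: np.quantile(X[:, f], qs) for f in range(n_features)}
        self.default_ = Counter(y.tolist()).most_common(1)[0][0]
        self.ensemble_ = []
        # Enumerating every h-subset of features is only sensible for demos.
        for feats in combinations(range(n_features), self.h):
            groups = defaultdict(list)
            for i in range(len(X)):
                groups[self._region(X[i], feats)].append(i)
            local = {}
            for key, idx in groups.items():
                idx = np.asarray(idx)
                if len(set(y[idx])) > 1:
                    # The key point: a *local* model per region, trained
                    # only on the instances that fall inside it.
                    local[key] = SVC().fit(X[idx], y[idx])
                else:
                    local[key] = y[idx][0]  # pure region: store the class
            self.ensemble_.append((feats, local))
        return self

    def predict(self, X):
        out = []
        for x in X:
            votes = Counter()
            for feats, local in self.ensemble_:
                m = local.get(self._region(x, feats))
                if m is None:
                    continue  # region never seen in training: abstain
                label = m.predict(x[None, :])[0] if hasattr(m, "predict") else m
                votes[label] += 1
            out.append(votes.most_common(1)[0][0] if votes else self.default_)
        return np.array(out)
```

Under these assumptions the ensemble behaves like a committee of many highly localised SVMs, and each base model trains on only a fraction of the data, which is consistent with the abstract's observation that increased localisation reduces the time to generate each model in the ensemble.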

Source: Springer Link