Feature-subspace aggregating: ensembles for stable and unstable learners
This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners. In contrast, most existing ensemble approaches can improve the predictive performance of unstable learners only.
Published in: | Machine learning 2011-03, Vol.82 (3), p.375-397 |
---|---|
Main Authors: | Ting, Kai Ming; Wells, Jonathan R.; Tan, Swee Chuan; Teng, Shyh Wei; Webb, Geoffrey I. |
Format: | Article |
Language: | English |
Subjects: | Applied sciences; Artificial Intelligence; Computer Science; Computer science, control theory, systems; Control; Data processing. List processing. Character string processing; Exact sciences and technology; Mechatronics; Memory organisation. Data processing; Natural Language Processing (NLP); Robotics; Simulation and Modeling; Software |
container_end_page | 397 |
container_issue | 3 |
container_start_page | 375 |
container_title | Machine learning |
container_volume | 82 |
creator | Ting, Kai Ming; Wells, Jonathan R.; Tan, Swee Chuan; Teng, Shyh Wei; Webb, Geoffrey I. |
description | This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners. In contrast, most existing ensemble approaches can improve the predictive performance of unstable learners only. Our analysis shows that the new approach reduces the execution time to generate a model in an ensemble through an increased level of localisation in Feating. Our empirical evaluation shows that Feating performs significantly better than Boosting, Random Subspace and Bagging in terms of predictive accuracy, when a stable learner SVM is used as the base learner. The speed up achieved by Feating makes feasible SVM ensembles that would otherwise be infeasible for large data sets. When SVM is the preferred base learner, we show that Feating SVM performs better than Boosting decision trees and Random Forests. We further demonstrate that Feating also substantially reduces the error of another stable learner, k-nearest neighbour, and an unstable learner, decision tree. |
doi_str_mv | 10.1007/s10994-010-5224-5 |
format | article |
publisher | Boston: Springer US |
fulltext | fulltext |
identifier | ISSN: 0885-6125 |
ispartof | Machine learning, 2011-03, Vol.82 (3), p.375-397 |
issn | 0885-6125 (print); 1573-0565 (electronic) |
language | eng |
recordid | cdi_proquest_journals_854984313 |
source | Springer Nature |
subjects | Applied sciences; Artificial Intelligence; Computer Science; Computer science, control theory, systems; Control; Data processing. List processing. Character string processing; Exact sciences and technology; Mechatronics; Memory organisation. Data processing; Natural Language Processing (NLP); Robotics; Simulation and Modeling; Software |
title | Feature-subspace aggregating: ensembles for stable and unstable learners |
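The record's abstract describes Feating only at a high level: one ensemble member per feature subspace, with local models trained on the partitions that subspace induces. The sketch below is a minimal, illustrative rendering of that idea, not the authors' implementation: the `NearestCentroid` base learner, the equal-frequency binning, and the parameters `h` and `bins` are assumptions standing in for the paper's level trees and its SVM/k-NN/decision-tree base learners.

```python
import itertools
from collections import Counter
import numpy as np

class NearestCentroid:
    """Tiny stand-in base learner (a 'stable' learner, like k-NN); not from the paper."""
    def fit(self, X, y):
        self.labels = np.unique(y)
        self.centroids = np.array([X[y == c].mean(axis=0) for c in self.labels])
        return self

    def predict(self, X):
        # Assign each row the label of its nearest class centroid.
        d = ((X[:, None, :] - self.centroids[None, :, :]) ** 2).sum(axis=-1)
        return self.labels[d.argmin(axis=1)]

class Feating:
    """Sketch of Feature-Subspace Aggregating: one ensemble member per
    combination of `h` features; each member partitions the data by
    binning those features and fits a local model inside every partition.
    Prediction routes a query to its partition in each member and
    majority-votes the local predictions."""
    def __init__(self, base=NearestCentroid, h=2, bins=2):
        self.base, self.h, self.bins = base, h, bins

    def _cell(self, X, feats):
        # Map each row to an integer partition key over the chosen features.
        key = np.zeros(len(X), dtype=int)
        for f in feats:
            key = key * self.bins + np.digitize(X[:, f], self.edges[f])
        return key

    def fit(self, X, y):
        n = X.shape[1]
        # Interior equal-frequency bin edges per feature.
        qs = np.linspace(0, 1, self.bins + 1)[1:-1]
        self.edges = {f: np.quantile(X[:, f], qs) for f in range(n)}
        self.default = Counter(y.tolist()).most_common(1)[0][0]
        self.members = []
        for feats in itertools.combinations(range(n), self.h):
            cells = self._cell(X, feats)
            models = {}
            for c in np.unique(cells):
                Xi, yi = X[cells == c], y[cells == c]
                # A single-class partition needs no model: store the label.
                models[c] = (self.base().fit(Xi, yi)
                             if len(np.unique(yi)) > 1 else yi[0])
            self.members.append((feats, models))
        return self

    def predict(self, X):
        preds = []
        for x in X:
            votes = []
            for feats, models in self.members:
                m = models.get(self._cell(x[None, :], feats)[0])
                if m is not None:
                    votes.append(m.predict(x[None, :])[0]
                                 if hasattr(m, "predict") else m)
            preds.append(Counter(votes).most_common(1)[0][0]
                         if votes else self.default)
        return np.array(preds)

# Demo: two well-separated Gaussian blobs in four dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 4)), rng.normal(3, 1, (40, 4))])
y = np.array([0] * 40 + [1] * 40)
model = Feating(h=2, bins=2).fit(X, y)
acc = float((model.predict(X) == y).mean())
```

Enumerating all C(n, h) feature combinations mirrors the exhaustive flavour described in the abstract; because each local model sees only a fraction of the data, each is cheap to train, which is the source of the speed-up the abstract claims for SVM ensembles.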