Feature selection in MLPs and SVMs based on maximum output information
This paper presents feature selection algorithms for multilayer perceptrons (MLPs) and multiclass support vector machines (SVMs), using the mutual information between class labels and classifier outputs as an objective function. This objective function involves inexpensive computation of information measures only on discrete variables; provides immunity to prior class probabilities; and brackets the probability of error of the classifier. The maximum output information (MOI) algorithms employ this function for feature subset selection by greedy elimination and directed search. The output of the MOI algorithms is a feature subset of user-defined size and an associated trained classifier (MLP/SVM). These algorithms compare favorably with a number of other methods in terms of performance on various artificial and real-world data sets.
Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2004-07, Vol. 15 (4), p. 937-948 |
---|---|
Main Authors: | Sindhwani, V.; Rakshit, S.; Deodhare, D.; Erdogmus, D.; Principe, J.C.; Niyogi, P. |
Format: | Article |
Language: | English |
Subjects: | Feature selection; Mutual information; Multilayer perceptrons; Support vector machines; Supervised learning; Information theory; Neural networks |
container_end_page | 948 |
container_issue | 4 |
container_start_page | 937 |
container_title | IEEE Transactions on Neural Networks and Learning Systems |
container_volume | 15 |
creator | Sindhwani, V. Rakshit, S. Deodhare, D. Erdogmus, D. Principe, J.C. Niyogi, P. |
description | This paper presents feature selection algorithms for multilayer perceptrons (MLPs) and multiclass support vector machines (SVMs), using mutual information between class labels and classifier outputs, as an objective function. This objective function involves inexpensive computation of information measures only on discrete variables; provides immunity to prior class probabilities; and brackets the probability of error of the classifier. The maximum output information (MOI) algorithms employ this function for feature subset selection by greedy elimination and directed search. The output of the MOI algorithms is a feature subset of user-defined size and an associated trained classifier (MLP/SVM). These algorithms compare favorably with a number of other methods in terms of performance on various artificial and real-world data sets. |
doi_str_mv | 10.1109/TNN.2004.828772 |
format | article |
fulltext | fulltext |
identifier | ISSN: 1045-9227 |
ispartof | IEEE Transactions on Neural Networks and Learning Systems, 2004-07, Vol.15 (4), p.937-948 |
issn | 1045-9227; 2162-237X; 1941-0093; 2162-2388 |
language | eng |
recordid | cdi_crossref_primary_10_1109_TNN_2004_828772 |
source | IEEE Xplore (Online service) |
subjects | Algorithms; Artificial Intelligence; Breast Neoplasms - classification; Breast Neoplasms - diagnosis; Classifiers; Cluster Analysis; Computer science; Computer Simulation; Computing Methodologies; Decision Support Techniques; Error analysis; Filters; Humans; Information Storage and Retrieval - methods; Information Theory; Likelihood Functions; Mathematical analysis; Mathematical models; Models, Statistical; Multilayer perceptrons; Mutual information; Neural networks; Neural Networks (Computer); Partitioning algorithms; Pattern Recognition, Automated; Probability distribution; Probability Learning; Supervised learning; Support vector machine classification; Support vector machines |
title | Feature selection in MLPs and SVMs based on maximum output information |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-27T10%3A26%3A33IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Feature%20selection%20in%20MLPs%20and%20SVMs%20based%20on%20maximum%20output%20information&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Sindhwani,%20V.&rft.date=2004-07-01&rft.volume=15&rft.issue=4&rft.spage=937&rft.epage=948&rft.pages=937-948&rft.issn=1045-9227&rft.eissn=1941-0093&rft.coden=ITNNEP&rft_id=info:doi/10.1109/TNN.2004.828772&rft_dat=%3Cproquest_cross%3E66934223%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c510t-2b5fd9f51894dba85894047d35f9b064f8fc845182d0ac8321a6b8a046efffe13%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=920883904&rft_id=info:pmid/15461085&rft_ieee_id=1310365&rfr_iscdi=true |
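The abstract's procedure — score a feature subset by the mutual information between the true class labels and the trained classifier's discrete outputs, then prune features by greedy backward elimination — can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the authors' implementation: the names `output_information`, `greedy_elimination`, and the `train_fn` hook (which in the paper would wrap MLP or SVM training and return the classifier's predicted labels) are all hypothetical.

```python
import numpy as np

def output_information(y_true, y_pred, n_classes):
    # Estimate I(Y; Yhat) in bits from the empirical joint distribution
    # (i.e., the normalized confusion matrix) of labels and outputs.
    # Both arguments are discrete, so this is cheap to compute.
    joint = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        joint[t, p] += 1.0
    joint /= joint.sum()
    py = joint.sum(axis=1, keepdims=True)    # marginal P(Y)
    pyh = joint.sum(axis=0, keepdims=True)   # marginal P(Yhat)
    nz = joint > 0                           # skip zero cells (0 log 0 = 0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (py @ pyh)[nz])))

def greedy_elimination(X, y, n_classes, target_size, train_fn):
    # Backward elimination: at each step, retrain on every candidate
    # subset with one feature removed, and keep the subset whose
    # retrained classifier preserves the most output information.
    features = list(range(X.shape[1]))
    while len(features) > target_size:
        scored = []
        for f in features:
            rest = [g for g in features if g != f]
            y_pred = train_fn(X[:, rest], y)
            scored.append((output_information(y, y_pred, n_classes), f))
        _, drop = max(scored)   # dropping `drop` hurts the least
        features.remove(drop)
    return features
```

A perfectly predictive output attains I(Y; Yhat) = H(Y), while an output independent of the labels scores zero, so the greedy loop tends to retain the informative features; the actual paper pairs this wrapper-style search with directed search variants as well.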