Constrained classifier: a novel approach to nonlinear classification
Simple classifiers have the advantage of greater generalization capability, with the side effect of less power. It would be desirable to build a classifier that is as simple as possible while still able to classify complex patterns. In this paper, a hybrid classifier called “constrained classifier”...
Published in: | Neural computing & applications, 2013-12, Vol. 23 (7-8), pp. 2367-2377 |
---|---|
Main Authors: | Abbassi, H.; Monsefi, R.; Sadoghi Yazdi, H. |
Format: | Article |
Language: | English |
creator | Abbassi, H.; Monsefi, R.; Sadoghi Yazdi, H. |
description | Simple classifiers have the advantage of greater generalization capability, with the side effect of less power. It would be desirable to build a classifier that is as simple as possible while still able to classify complex patterns. In this paper, a hybrid classifier called “constrained classifier” is presented that classifies most of the input patterns using a simple, for example linear, classifier. It performs the classification in four steps. In the “Dividing” step, the input patterns are divided into linearly separable and nonlinearly separable groups. The patterns belonging to the first group are classified using a simple classifier, while the second group of patterns (named “constraints”) is modeled in the “Modeling” step. The results of the previous steps are merged in the “Combining” step. The “Evaluation” step tests and fine-tunes the assignment of patterns to the two groups. The experimental results of comparing the new classifier with well-known classifiers such as support vector machines, k-NN, and classification and regression trees are very encouraging. |
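The four steps described in the abstract (Dividing, Modeling, Combining, Evaluation) can be sketched as follows. This is a minimal illustrative reading, not the authors' actual implementation: a perceptron stands in for the "simple classifier", 1-NN over the constraint patterns stands in for the constraint model, and the distance threshold that routes a query to the constraint model is an assumed stand-in for the tuning done in the Evaluation step.

```python
# Hedged sketch of the "constrained classifier" idea. All function names,
# the perceptron/1-NN choices, and the threshold below are illustrative
# assumptions, not the method as published.

def train_perceptron(X, y, epochs=50):
    """Simple linear classifier; returns weights (last entry is the bias)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi + [1.0])) > 0 else -1
            if pred != yi:
                for j, xj in enumerate(xi + [1.0]):
                    w[j] += yi * xj
    return w

def linear_predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x + [1.0])) > 0 else -1

def nearest_neighbor_predict(constraints, x):
    """Modeling step stand-in: 1-NN over the stored constraint patterns."""
    best = min(constraints,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(c[0], x)))
    return best[1]

def fit_constrained(X, y):
    # Dividing: train the simple classifier; patterns it misclassifies
    # become the "constraints" handled by the second model.
    w = train_perceptron(X, y)
    constraints = [(xi, yi) for xi, yi in zip(X, y)
                   if linear_predict(w, xi) != yi]
    return w, constraints

def predict_constrained(model, x):
    # Combining: defer to the constraint model only when x lies close to a
    # constraint pattern; otherwise use the simple linear rule.
    w, constraints = model
    if constraints:
        d = min(sum((a - b) ** 2 for a, b in zip(c, x))
                for c, _ in constraints)
        if d < 0.25:  # assumed threshold; the Evaluation step would tune this
            return nearest_neighbor_predict(constraints, x)
    return linear_predict(w, x)

# Toy XOR-style data: not linearly separable, so some training patterns
# necessarily end up in the constraint group.
X = [[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]]
y = [-1, -1, 1, 1]
model = fit_constrained(X, y)
```

On this toy set the linear part alone cannot fit XOR, so at least one pattern becomes a constraint; the combined predictor then classifies every training pattern correctly, which mirrors the paper's claim of keeping the bulk of the work in the simple classifier.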
doi_str_mv | 10.1007/s00521-012-1194-9 |
publisher | London: Springer London |
identifier | ISSN: 0941-0643; EISSN: 1433-3058 |
source | Springer Link |
subjects | Applied sciences; Artificial Intelligence; Computational Biology/Bioinformatics; Computational Science and Engineering; Computer Science; Computer science, control theory, systems; Data Mining and Knowledge Discovery; Data processing. List processing. Character string processing; Exact sciences and technology; Image Processing and Computer Vision; Memory organisation. Data processing; Original Article; Probability and Statistics in Computer Science; Software |