
Rectified Softmax Loss With All-Sided Cost Sensitivity for Age Estimation

In Convolutional Neural Network (ConvNet) based age estimation algorithms, the softmax loss is usually chosen directly as the loss function, and Cost Sensitivity (CS) problems, such as class imbalance and differing misclassification costs between classes, are not considered. Focusing on these problems, this paper constructs a rectified softmax loss function with all-sided CS and proposes a novel cost-sensitive ConvNet-based age estimation algorithm. First, a loss function is established for each age category to address the imbalance in the number of training samples per class. Then, a cost matrix is defined to reflect the cost differences caused by misclassification between classes, yielding a new cost-sensitive error function. Finally, these two components are merged into a rectified softmax loss function for the ConvNet model, and a corresponding Back Propagation (BP) training scheme is designed so that the network learns a robust face representation for age estimation during training. It is also proved theoretically that the rectified softmax loss satisfies the general conditions required of a classification loss function. The effectiveness of the proposed method is verified by experiments on face image datasets of different races.
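
The paper's exact loss formulation is not reproduced in this record. As a rough illustration of the two ingredients named in the abstract (a per-class weight for class imbalance and a cost matrix for misclassification costs), a minimal PyTorch-style sketch of a cost-sensitive softmax loss might look like the following; the function name, the way the two terms are combined, and all parameter conventions are assumptions for illustration, not the authors' definitions.

```python
import torch
import torch.nn.functional as F

def cost_sensitive_softmax_loss(logits, targets, class_weights, cost_matrix):
    """Illustrative cost-sensitive softmax loss (not the paper's exact formula).

    logits:        (N, C) raw ConvNet outputs for N faces and C age classes
    targets:       (N,)   integer ground-truth age-class labels
    class_weights: (C,)   per-class weights (e.g. inverse class frequency),
                          intended to counter training-set imbalance
    cost_matrix:   (C, C) cost_matrix[i, j] = assumed cost of predicting
                          class j when the true class is i (0 on the diagonal)
    """
    log_probs = F.log_softmax(logits, dim=1)                       # (N, C)
    probs = log_probs.exp()

    # Imbalance term: class-weighted negative log-likelihood of the true class.
    nll = -log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)    # (N,)
    imbalance_term = class_weights[targets] * nll

    # Cost term: expected misclassification cost under the predicted distribution.
    cost_term = (probs * cost_matrix[targets]).sum(dim=1)          # (N,)

    return (imbalance_term + cost_term).mean()


if __name__ == "__main__":
    # Tiny smoke test with 3 age classes and |i - j| as a toy cost matrix.
    torch.manual_seed(0)
    logits = torch.randn(4, 3)
    targets = torch.tensor([0, 2, 1, 0])
    class_weights = torch.tensor([0.5, 1.0, 2.0])
    ages = torch.arange(3, dtype=torch.float32)
    cost_matrix = (ages.unsqueeze(0) - ages.unsqueeze(1)).abs()
    print(cost_sensitive_softmax_loss(logits, targets, class_weights, cost_matrix))
```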

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, p. 32551-32563
Main Authors: Li, Daxiang; Ma, Xuan; Ren, Yaqiong; Teng, Shyh-Wei
Format: Article
Language: English
DOI: 10.1109/ACCESS.2020.2964281
ISSN: 2169-3536
Source: IEEE Xplore Open Access Journals
Subjects:
Active appearance model
Age
Age estimation
Aging
Algorithms
Artificial neural networks
Back propagation networks
Chronology
cost sensitivity
Error functions
Estimation
Face
Sensitivity
softmax loss
Training