
Augmentation Invariant and Instance Spreading Feature for Softmax Embedding

Deep embedding learning plays a key role in learning discriminative feature representations, where visually similar samples are pulled closer and dissimilar samples are pushed away in the low-dimensional embedding space. This paper studies the unsupervised embedding learning problem: learning such a representation without using any category labels.


Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022-02, Vol. 44 (2), p. 924-939
Main Authors: Ye, Mang, Shen, Jianbing, Zhang, Xu, Yuen, Pong C., Chang, Shih-Fu
Format: Article
Language:English
Description: Deep embedding learning plays a key role in learning discriminative feature representations, where visually similar samples are pulled closer and dissimilar samples are pushed away in the low-dimensional embedding space. This paper studies the unsupervised embedding learning problem: learning such a representation without using any category labels. This task faces two primary challenges: mining reliable positive supervision from highly similar fine-grained classes, and generalizing to unseen testing categories. To approximate the positive-concentration and negative-separation properties of category-wise supervised learning, we introduce a data-augmentation-invariant and instance-spreading feature learned with instance-wise supervision. We also design two novel domain-agnostic augmentation strategies that further extend the supervision into feature space, simulating large-batch training with a small batch size and the augmented features. To learn such a representation, we propose a novel instance-wise softmax embedding, which directly performs the optimization over the augmented instance features with binary-discrimination softmax encoding. It significantly accelerates learning and achieves much higher accuracy than existing methods, under both seen and unseen testing categories. The unsupervised embedding performs well even without a pre-trained network, on samples from fine-grained categories. We also develop a variant using category-wise supervision, namely category-wise softmax embedding, which achieves competitive performance against the state of the art without using any auxiliary information or restricted sample mining.
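For readers who want the gist of the method in code, the following is a minimal NumPy sketch of instance-wise softmax embedding, not the authors' implementation: the function names (`instance_softmax_loss`, `feature_space_augment`), the Gaussian feature perturbation as the feature-space augmentation, and the temperature value are all illustrative assumptions. Only the core idea comes from the abstract: each sample acts as its own class, and an augmented view of instance i should be classified back to instance i via a softmax over instance similarities.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    """Project embeddings onto the unit hypersphere (row-wise)."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def feature_space_augment(features, noise_scale=0.05, rng=None):
    """Stand-in for a domain-agnostic feature-space augmentation:
    perturb each embedding with small Gaussian noise to create extra
    'virtual' views without touching the input domain. The paper's
    actual strategies are more elaborate; this is illustrative only."""
    rng = np.random.default_rng() if rng is None else rng
    return features + noise_scale * rng.normal(size=features.shape)

def instance_softmax_loss(features, aug_features, tau=0.1):
    """Instance-wise softmax embedding loss (simplified sketch).

    Row i of the logits scores an augmented view against every
    instance in the batch; the 'correct class' is its own source
    instance i, encouraging augmentation invariance while spreading
    different instances apart."""
    f = l2_normalize(features)      # (n, d) anchor embeddings
    g = l2_normalize(aug_features)  # (n, d) augmented-view embeddings
    logits = g @ f.T / tau          # (n, n) temperature-scaled cosine sims
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))  # cross-entropy with target = own instance
```

The loss is small when each augmented view sits close to its own anchor and far from every other instance, and grows when views are matched to the wrong instance, which is the softmax analogue of the pull-together/push-apart behavior the abstract describes.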
DOI: 10.1109/TPAMI.2020.3013379
ISSN: 0162-8828
EISSN: 1939-3539, 2160-9292
Source: IEEE Xplore (Online service)
Subjects: Algorithms
Attention
Categories
Data augmentation
Data mining
Embedding
embedding learning
instance feature
Invariants
Optimization
Representations
softmax embedding
Supervised learning
Supervision
Task analysis
Testing
Training
Unsupervised learning
Visualization