Learning from Few Samples with Memory Network
Neural networks (NNs) have achieved great success in pattern recognition and machine learning. However, the success of an NN usually relies on a sufficiently large number of training samples. When fed with a limited data set, an NN's performance may degrade significantly. In this paper, a novel NN structure, called a memory network, is proposed. It is inspired by the cognitive mechanism of human beings, who can learn effectively even from limited data. By taking advantage of the memory of previous samples, the new model achieves a remarkable improvement in performance when trained on limited data. The memory network is demonstrated here using the multi-layer perceptron (MLP) as a base model, but the idea extends straightforwardly to other neural networks, e.g., convolutional neural networks (CNNs). The paper details the memory network structure, presents the training algorithm, and reports a series of experiments validating the proposed framework. Experimental results show that the proposed model outperforms traditional MLP-based models as well as other competitive algorithms on two real benchmark data sets.
Published in: | Cognitive Computation, 2018-02, Vol. 10 (1), p. 15-22 |
---|---|
Main Authors: | Zhang, Shufei; Huang, Kaizhu; Zhang, Rui; Hussain, Amir |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Artificial Intelligence; Artificial neural networks; Biomedical and Life Sciences; Biomedicine; Computation by Abstract Devices; Computational Biology/Bioinformatics; Datasets; Deep learning; Knowledge; Language; Machine learning; Memory; Multilayer perceptrons; Multilayers; Neural networks; Neurosciences; Optimization; Pattern recognition; Performance degradation |
DOI: | 10.1007/s12559-017-9507-z |
ISSN: | 1866-9956 |
EISSN: | 1866-9964 |
Publisher: | New York: Springer US |
Source: | Springer Nature |
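The record stops at the abstract, so the paper's actual architecture and training algorithm are not available here. As a rough illustration of the idea the abstract describes (an MLP augmented with a memory of previously seen samples, so that predictions on small data sets can also draw on stored exemplars), the following is a minimal sketch. The class name `MemoryMLP`, the `remember` method, the cosine-kernel vote, and the `alpha` blending weight are all illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

class MemoryMLP:
    """Illustrative sketch only: not the architecture from the paper."""

    def __init__(self, n_in, n_hidden, n_classes, alpha=0.5):
        # Small random weights for a one-hidden-layer MLP.
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.alpha = alpha      # blend between MLP output and memory vote
        self.keys = []          # hidden codes of remembered samples
        self.values = []        # their one-hot labels
        self.n_classes = n_classes

    def hidden(self, x):
        return np.tanh(x @ self.W1)

    def remember(self, x, y):
        """Store a training sample's hidden code and label in the memory."""
        self.keys.append(self.hidden(x))
        onehot = np.zeros(self.n_classes)
        onehot[y] = 1.0
        self.values.append(onehot)

    def predict_proba(self, x):
        h = self.hidden(x)
        logits = h @ self.W2
        mlp_p = np.exp(logits - logits.max())
        mlp_p /= mlp_p.sum()
        if not self.keys:
            return mlp_p
        # Cosine similarity to every remembered sample, softmax-weighted
        # vote over their labels, then a convex blend with the MLP output.
        K = np.stack(self.keys)
        sims = K @ h / (np.linalg.norm(K, axis=1) * np.linalg.norm(h) + 1e-8)
        w = np.exp(sims - sims.max())
        mem_p = (w[:, None] * np.stack(self.values)).sum(axis=0) / w.sum()
        return self.alpha * mlp_p + (1.0 - self.alpha) * mem_p


# Tiny usage example: two classes, three samples each.
net = MemoryMLP(n_in=4, n_hidden=8, n_classes=2)
for label in (0, 1):
    for _ in range(3):
        net.remember(rng.normal(size=4) + 3.0 * label, label)
print(net.predict_proba(rng.normal(size=4) + 3.0))  # memory vote favors class 1
```

With only a few stored samples per class, the similarity-weighted vote lets a prediction draw directly on exemplars rather than on the trained weights alone, which is one common way memory-augmented few-shot models are built; the paper's own mechanism may well differ.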