
HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search

Realistic use of neural networks often requires adhering to multiple constraints on latency, energy and memory among others. A popular approach to find fitting networks is through constrained Neural Architecture Search (NAS), however, previous methods enforce the constraint only softly. Therefore, the resulting networks do not exactly adhere to the resource constraint and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), that is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
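The paper itself is not reproduced in this record, but the "expected resource requirement" idea common to differentiable NAS can be illustrated with a minimal sketch: relax the discrete choice of operation in a block to a softmax distribution over candidate ops, so the expected latency becomes a differentiable function of the architecture parameters. The latency values and function names below are illustrative assumptions, not the authors' exact formulation.

```python
import math

# Hypothetical per-op latencies (ms) for one searchable block; values are illustrative.
op_latency = [1.0, 2.5, 4.0]

def softmax(theta):
    # Numerically stable softmax over the architecture logits.
    m = max(theta)
    exps = [math.exp(t - m) for t in theta]
    total = sum(exps)
    return [e / total for e in exps]

def expected_latency(theta, latencies):
    # Expected latency when the block's op is drawn from softmax(theta):
    # a differentiable surrogate for the discrete architecture's latency,
    # over which a hard constraint can be enforced during the search.
    return sum(p * t for p, t in zip(softmax(theta), latencies))

# With uniform logits the expectation is the plain average: (1.0 + 2.5 + 4.0) / 3 = 2.5
print(expected_latency([0.0, 0.0, 0.0], op_latency))
```

Summing such per-block expectations over the whole network gives a latency estimate that a hard-constrained search can keep below the budget at every step, rather than penalizing violations softly after the fact.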


Bibliographic Details
Published in: arXiv.org, 2021-02
Main Authors: Nayman, Niv, Aflalo, Yonathan, Noy, Asaf, Zelnik-Manor, Lihi
Format: Article
Language: English
Subjects: Computer architecture; Constraints; Network latency; Neural networks; Search methods
container_title arXiv.org
creator Nayman, Niv
Aflalo, Yonathan
Noy, Asaf
Zelnik-Manor, Lihi
description Realistic use of neural networks often requires adhering to multiple constraints on latency, energy and memory among others. A popular approach to find fitting networks is through constrained Neural Architecture Search (NAS), however, previous methods enforce the constraint only softly. Therefore, the resulting networks do not exactly adhere to the resource constraint and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), that is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
format article
publisher Ithaca: Cornell University Library, arXiv.org
publication_date 2021-02-23
rights 2021. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2021-02
issn 2331-8422
language eng
recordid cdi_proquest_journals_2492818383
source Publicly Available Content Database (Proquest) (PQ_SDU_P3)
subjects Computer architecture
Constraints
Network latency
Neural networks
Search methods
title HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search