Balanced Training of Energy-Based Models with Adaptive Flow Sampling
Energy-based models (EBMs) are versatile density estimation models that directly parameterize an unnormalized log density. Although very flexible, EBMs lack a specified normalization constant of the model, making the likelihood of the model computationally intractable. Several approximate samplers and variational inference techniques have been proposed to estimate the likelihood gradients for training. These techniques have shown promising results in generating samples, but little attention has been paid to the statistical accuracy of the estimated density, such as determining the relative importance of different classes in a dataset. In this work, we propose a new maximum likelihood training algorithm for EBMs that uses a different type of generative model, normalizing flows (NF), which have recently been proposed to facilitate sampling. Our method fits an NF to an EBM during training so that an NF-assisted sampling scheme provides an accurate gradient for the EBMs at all times, ultimately leading to a fast sampler for generating new data.
Published in: | arXiv.org 2024-02 |
---|---|
Main Authors: | Grenioux, Louis ; Moulines, Éric ; Gabrié, Marylou |
Format: | Article |
Language: | English |
Subjects: | Adaptive sampling ; Algorithms ; Density ; Samplers |
Online Access: | Get full text |
cited_by | |
---|---|
cites | |
container_end_page | |
container_issue | |
container_start_page | |
container_title | arXiv.org |
container_volume | |
creator | Grenioux, Louis ; Moulines, Éric ; Gabrié, Marylou |
description | Energy-based models (EBMs) are versatile density estimation models that directly parameterize an unnormalized log density. Although very flexible, EBMs lack a specified normalization constant of the model, making the likelihood of the model computationally intractable. Several approximate samplers and variational inference techniques have been proposed to estimate the likelihood gradients for training. These techniques have shown promising results in generating samples, but little attention has been paid to the statistical accuracy of the estimated density, such as determining the relative importance of different classes in a dataset. In this work, we propose a new maximum likelihood training algorithm for EBMs that uses a different type of generative model, normalizing flows (NF), which have recently been proposed to facilitate sampling. Our method fits an NF to an EBM during training so that an NF-assisted sampling scheme provides an accurate gradient for the EBMs at all times, ultimately leading to a fast sampler for generating new data. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-02 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2821737187 |
source | Publicly Available Content Database (Proquest) (PQ_SDU_P3) |
subjects | Adaptive sampling ; Algorithms ; Density ; Samplers |
title | Balanced Training of Energy-Based Models with Adaptive Flow Sampling |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-24T17%3A01%3A12IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Balanced%20Training%20of%20Energy-Based%20Models%20with%20Adaptive%20Flow%20Sampling&rft.jtitle=arXiv.org&rft.au=Grenioux,%20Louis&rft.date=2024-02-18&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2821737187%3C/proquest%3E%3Cgrp_id%3Ecdi_FETCH-proquest_journals_28217371873%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2821737187&rft_id=info:pmid/&rfr_iscdi=true |
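The abstract describes the method only at a high level: alternate between estimating the EBM's likelihood gradient with samples drawn via a flow-assisted sampler, and refitting the flow to the EBM so the proposal stays accurate. The toy sketch below illustrates that alternation under heavy simplifying assumptions of my own (a 1-D quadratic energy, a single affine map standing in for the normalizing flow, and independent Metropolis–Hastings as the flow-assisted sampler); it is not the authors' algorithm, and every name in it is invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: the trained EBM should recover mean ~1.0 and variance ~0.25.
data = rng.normal(1.0, 0.5, size=2000)

# Quadratic EBM: E(x) = a*x^2 + b*x, so exp(-E) is an unnormalized Gaussian.
a, b = 0.5, 0.0

def energy(x):
    return a * x**2 + b * x

# Stand-in "flow": one affine map of a standard Gaussian, i.e. q = N(mu, sig^2).
mu, sig = 0.0, 2.0

def log_q(x):
    return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)

chain = rng.normal(mu, sig, size=500)   # persistent negative samples
lr = 0.1

for _ in range(2000):
    # Negative phase: independent Metropolis-Hastings with the flow as proposal.
    prop = rng.normal(mu, sig, size=chain.shape)
    log_alpha = (-energy(prop) - log_q(prop)) - (-energy(chain) - log_q(chain))
    chain = np.where(np.log(rng.uniform(size=chain.shape)) < log_alpha, prop, chain)

    # EBM step: the NLL gradient is E_data[dE/dtheta] - E_model[dE/dtheta].
    a -= lr * (np.mean(data**2) - np.mean(chain**2))
    b -= lr * (np.mean(data) - np.mean(chain))
    a = max(a, 1e-2)                    # keep the energy confining

    # Flow step: closed-form MLE fit of the affine flow to the EBM samples.
    mu, sig = chain.mean(), max(chain.std(), 1e-2)

model_mean = -b / (2 * a)               # mean of the Gaussian exp(-E) defines
model_var = 1.0 / (2 * a)
print(model_mean, model_var)            # should approach ~1.0 and ~0.25
```

Because the proposal is refit to the EBM after every parameter step, the sampler's acceptance rate stays high throughout training, which is the intuition behind keeping the gradient estimate accurate "at all times"; a real implementation would replace the affine map with a trained normalizing flow in higher dimensions.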