A covariance matrix adaptation evolution strategy in reproducing kernel Hilbert space
The covariance matrix adaptation evolution strategy (CMA-ES) is an efficient derivative-free optimization algorithm. It optimizes a black-box objective function over a well-defined parameter space in which feature functions are often defined manually. Therefore, the performance of those techniques strongly depends on the quality of the chosen features or the underlying parametric function space.
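For context on the baseline algorithm the abstract builds on, the sketch below illustrates the core sampling-and-update loop of a standard finite-dimensional CMA-ES: a Gaussian search distribution whose mean is moved by weighted recombination of the best samples and whose covariance matrix receives a rank-mu update. This is a minimal illustration only, written here for this record: the function name `cma_es_sketch`, its parameters, and the learning-rate choice are simplifications of ours, evolution paths and step-size adaptation are omitted, and it is not the article's CMA-ES-RKHS method, which adapts a mean function and a covariance operator of a Gaussian process over an RKHS rather than a mean vector and covariance matrix.

```python
import numpy as np

def cma_es_sketch(f, x0, sigma=0.5, n_iters=200, seed=0):
    """Toy (mu/mu_w, lambda) CMA-ES: weighted recombination of the best samples
    plus a rank-mu covariance update. Evolution paths and step-size adaptation
    are omitted, so this is illustrative rather than the full algorithm."""
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    n = mean.size
    lam = 4 + int(3 * np.log(n))                 # population size (common heuristic)
    mu = lam // 2                                # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                 # positive recombination weights
    mu_eff = 1.0 / np.sum(w ** 2)
    c_mu = min(0.9, mu_eff / n ** 2)             # crude rank-mu learning rate
    C = np.eye(n)                                # search covariance, adapted below
    for _ in range(n_iters):
        A = np.linalg.cholesky(C)                # sample y ~ N(0, C) via its Cholesky factor
        ys = rng.standard_normal((lam, n)) @ A.T
        xs = mean + sigma * ys                   # candidate solutions
        order = np.argsort([f(x) for x in xs])[:mu]
        y_sel = ys[order]                        # steps taken by the mu best samples
        mean = mean + sigma * (w @ y_sel)        # move the mean toward the better samples
        C = (1 - c_mu) * C + c_mu * (y_sel.T * w) @ y_sel  # rank-mu covariance update
    return mean

# Example: minimize a shifted sphere function in five dimensions.
if __name__ == "__main__":
    sphere = lambda x: float(np.sum((x - 1.0) ** 2))
    print(cma_es_sketch(sphere, np.zeros(5)))
```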
Published in: | Genetic programming and evolvable machines, 2019-12, Vol.20 (4), p.479-501 |
---|---|
Main Authors: | Dang, Viet-Hung; Vien, Ngo Anh; Chung, TaeChoong |
Format: | Article |
Language: | English |
container_end_page | 501 |
---|---|
container_issue | 4 |
container_start_page | 479 |
container_title | Genetic programming and evolvable machines |
container_volume | 20 |
creator | Dang, Viet-Hung; Vien, Ngo Anh; Chung, TaeChoong |
description | The covariance matrix adaptation evolution strategy (CMA-ES) is an efficient derivative-free optimization algorithm. It optimizes a black-box objective function over a well-defined parameter space in which feature functions are often defined manually. Therefore, the performance of those techniques strongly depends on the quality of the chosen features or the underlying parametric function space. Hence, enabling CMA-ES to optimize on a more complex and general function class has long been desired. In this paper, we consider modeling the input spaces in black-box optimization non-parametrically in reproducing kernel Hilbert spaces (RKHS). This modeling leads to a functional optimization problem whose domain is an RKHS function space, which enables optimization over a very rich function class. We propose CMA-ES-RKHS, a generalized CMA-ES framework that is able to carry out black-box functional optimization in RKHS. A search distribution on non-parametric function spaces, represented as a Gaussian process, is adapted by updating both its mean function and covariance operator. An adaptive and sparse representation of the mean function and the covariance operator can be retained for efficient computation in the updates and evaluations of CMA-ES-RKHS by resorting to sparsification. We also show how to apply our new black-box framework to search for an optimal policy in reinforcement learning, in which policies are represented as functions in an RKHS. CMA-ES-RKHS is evaluated on two functional optimization problems and two benchmark reinforcement learning domains. |
doi_str_mv | 10.1007/s10710-019-09357-1 |
format | article |
publisher | New York: Springer US |
rights | The Author(s) 2019; Copyright Springer Nature B.V. 2019 |
fulltext | fulltext |
identifier | ISSN: 1389-2576; EISSN: 1573-7632 |
ispartof | Genetic programming and evolvable machines, 2019-12, Vol.20 (4), p.479-501 |
issn | 1389-2576; 1573-7632 |
language | eng |
recordid | cdi_proquest_journals_2305102278 |
source | Springer Link |
subjects | Adaptation; Algorithms; Artificial Intelligence; Biological evolution; Biomedical Engineering and Bioengineering; Compilers; Computer Science; Covariance matrix; Domains; Electrical Engineering; Function space; Functionals; Gaussian process; Hilbert space; Interpreters; Kernels; Machine learning; Operators (mathematics); Optimization; Programming Languages; Programming Techniques; Software Engineering/Programming and Operating Systems |
title | A covariance matrix adaptation evolution strategy in reproducing kernel Hilbert space |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-26T08%3A29%3A32IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20covariance%20matrix%20adaptation%20evolution%20strategy%20in%20reproducing%20kernel%20Hilbert%20space&rft.jtitle=Genetic%20programming%20and%20evolvable%20machines&rft.au=Dang,%20Viet-Hung&rft.date=2019-12-01&rft.volume=20&rft.issue=4&rft.spage=479&rft.epage=501&rft.pages=479-501&rft.issn=1389-2576&rft.eissn=1573-7632&rft_id=info:doi/10.1007/s10710-019-09357-1&rft_dat=%3Cproquest_cross%3E2305102278%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c363t-b8ad9bde01507d90ba4903666a7cf2dc58c89f3f610084002db2209db2ff606a3%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2305102278&rft_id=info:pmid/&rfr_iscdi=true |