Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
In this paper, we study the asymptotic properties of regularized least squares with indefinite kernels in reproducing kernel Kreĭn spaces (RKKS). By introducing a bounded hyper-sphere constraint to such a non-convex regularized risk minimization problem, we theoretically demonstrate that this problem has a globally optimal solution with a closed form on the sphere, which makes approximation analysis feasible in RKKS. With regard to the original regularizer induced by the indefinite inner product, we modify traditional error decomposition techniques, prove convergence results for the introduced hypothesis error based on matrix perturbation theory, and derive learning rates of such a regularized regression problem in RKKS. Under some conditions, the derived learning rates in RKKS are the same as those in reproducing kernel Hilbert spaces (RKHS). To the best of our knowledge, this is the first work on approximation analysis of regularized learning algorithms in RKKS.
Published in: | Machine Learning, 2021-06, Vol.110 (6), p.1145-1173 |
---|---|
Main Authors: | Liu, Fanghui; Shi, Lei; Huang, Xiaolin; Yang, Jie; Suykens, Johan A. K. |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Approximation; Artificial Intelligence; Asymptotic properties; Computer Science; Control; Hilbert space; Kernels; Least squares; Machine Learning; Mathematical analysis; Mechatronics; Natural Language Processing (NLP); Optimization; Perturbation theory; Robotics; Simulation and Modeling |
DOI: | 10.1007/s10994-021-05955-2 |
ISSN: | 0885-6125 |
EISSN: | 1573-0565 |
Publisher: | New York: Springer US |
Source: | Springer Nature |
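For intuition, here is a minimal, illustrative sketch (not the authors' algorithm) of least-squares fitting with an indefinite kernel under a hard norm constraint, loosely in the spirit of the sphere-constrained problem described in the abstract. The difference-of-Gaussians kernel, the Euclidean coefficient ball, the radius, and the toy data are all assumptions made for illustration.

```python
# Illustrative sketch only: norm-constrained least squares with an indefinite
# (Krein-type) kernel.  The kernel, constraint, radius, and data below are
# assumptions for demonstration, not the construction used in the paper.
import numpy as np
from scipy.optimize import brentq


def indefinite_kernel(X, Y, s1=1.0, s2=0.3, w=0.6):
    """Toy indefinite kernel: a weighted difference of two Gaussian kernels."""
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * s1**2)) - w * np.exp(-d2 / (2 * s2**2))


def sphere_constrained_ls(K, y, r):
    """Minimize ||K a - y||^2 subject to ||a||_2 <= r, picking a boundary
    solution via a 1-D search over the Lagrange multiplier (SVD of K)."""
    U, s, Vt = np.linalg.svd(K)
    b = U.T @ y  # coordinates of y in the left singular basis

    def excess_norm(mu):
        # a(mu) = (K^T K + mu I)^{-1} K^T y has SVD coefficients s*b/(s^2 + mu)
        return np.linalg.norm(s * b / (s**2 + mu)) - r

    if excess_norm(0.0) <= 0:
        mu = 0.0  # unconstrained minimizer already satisfies the constraint
    else:
        hi = 1.0
        while excess_norm(hi) > 0:  # bracket the multiplier, then root-find
            hi *= 10.0
        mu = brentq(excess_norm, 0.0, hi)
    return Vt.T @ (s * b / (s**2 + mu))


rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(40)
K = indefinite_kernel(X, X)
alpha = sphere_constrained_ls(K, y, r=2.0)
print("residual:", np.linalg.norm(K @ alpha - y), "||alpha||:", np.linalg.norm(alpha))
```

In the RKKS setting of the paper, the constraint and regularizer involve the indefinite inner product rather than the Euclidean norm of the coefficient vector, so this snippet only mirrors the overall shape of the problem: a least-squares objective with a ball/sphere constraint solved through a one-dimensional multiplier search.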