A Dual Semismooth Newton Based Augmented Lagrangian Method for Large-Scale Linearly Constrained Sparse Group Square-Root Lasso Problems
Square-root Lasso problems have already been shown to be robust regression problems. Furthermore, square-root regression problems with structured sparsity also play an important role in statistics and machine learning. In this paper, we focus on the numerical computation of large-scale linearly constrained sparse group square-root Lasso problems. To overcome the difficulty that the objective function contains two nonsmooth terms, we propose a dual semismooth Newton (SSN) based augmented Lagrangian method (ALM): we apply the ALM to the dual problem and solve each subproblem by the SSN method. To apply the SSN method, the positive definiteness of the generalized Jacobian is very important, so we characterize the equivalence between its positive definiteness and the constraint nondegeneracy condition of the corresponding primal problem. In the numerical implementation, we fully exploit second-order sparsity so that the Newton direction can be obtained efficiently. Numerical experiments demonstrate the efficiency of the proposed algorithm.
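The record does not reproduce the model itself, but based on the title and abstract, a linearly constrained sparse group square-root Lasso problem typically has the following form. This is a sketch under assumed notation: the design matrix $A$, response $b$, constraint data $(B, c)$, group partition $\mathcal{G}$ of the coordinates, group weights $w_g$, and regularization parameters $\lambda_1, \lambda_2$ are illustrative placeholders, not symbols taken from the paper.

$$
\min_{x \in \mathbb{R}^n} \; \|Ax - b\|_2 \;+\; \lambda_1 \|x\|_1 \;+\; \lambda_2 \sum_{g \in \mathcal{G}} w_g \|x_g\|_2
\qquad \text{subject to} \qquad Bx = c.
$$

The square-root loss $\|Ax - b\|_2$ and the sparse group regularizer are the two nonsmooth terms mentioned in the abstract. The stated strategy is to apply an augmented Lagrangian method to the dual problem and to solve each ALM subproblem with a semismooth Newton method. A generic outer iteration of such a dual ALM, again under assumed notation (a dual objective split into $\phi + \psi$ with linear coupling $\mathcal{A}y + \mathcal{B}z = d$, penalty parameter $\sigma_k$, and multiplier $x^k$ that plays the role of the primal variable), looks like:

$$
\begin{aligned}
(y^{k+1}, z^{k+1}) &\approx \mathop{\mathrm{arg\,min}}_{y,\,z} \; \Big\{ \phi(y) + \psi(z) + \langle x^k,\, \mathcal{A}y + \mathcal{B}z - d \rangle + \tfrac{\sigma_k}{2}\, \|\mathcal{A}y + \mathcal{B}z - d\|^2 \Big\}, \\
x^{k+1} &= x^k + \sigma_k \big( \mathcal{A}y^{k+1} + \mathcal{B}z^{k+1} - d \big),
\end{aligned}
$$

where, per the abstract, the inner minimization is handled by an SSN method whose generalized Jacobian is positive definite exactly when the primal constraint nondegeneracy condition holds, and whose Newton systems can be assembled cheaply by exploiting second-order sparsity. The exact dual formulation and updates used by the authors may differ from this sketch.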
Published in: | Journal of scientific computing, 2023-08, Vol.96 (2), p.45, Article 45 |
---|---|
Main Authors: | Wang, Chengjing; Tang, Peipei |
Format: | Article |
Language: | English |
Field | Value |
---|---|
container_issue | 2 |
container_start_page | 45 |
container_title | Journal of scientific computing |
container_volume | 96 |
creator | Wang, Chengjing; Tang, Peipei |
doi_str_mv | 10.1007/s10915-023-02271-w |
format | article |
publisher | New York: Springer US |
identifier | ISSN: 0885-7474; EISSN: 1573-7691 |
ispartof | Journal of scientific computing, 2023-08, Vol.96 (2), p.45, Article 45 |
issn | 0885-7474; 1573-7691 |
language | eng |
recordid | cdi_proquest_journals_2918316882 |
source | Springer Link |
subjects | Algorithms; Computational Mathematics and Numerical Analysis; Constraints; Feature selection; Machine learning; Mathematical and Computational Engineering; Mathematical and Computational Physics; Mathematics; Mathematics and Statistics; Numerical analysis; Robustness (mathematics); Special Issue on Machine Learning on Scientific Computing; Theoretical |
title | A Dual Semismooth Newton Based Augmented Lagrangian Method for Large-Scale Linearly Constrained Sparse Group Square-Root Lasso Problems |