
The Bayesian Lasso

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors. Gibbs sampling from this posterior is possible using an expanded hierarchy with conjugate normal priors for the regression parameters and independent exponential priors on their variances. A connection with the inverse-Gaussian distribution provides tractable full conditional distributions. The Bayesian Lasso provides interval estimates (Bayesian credible intervals) that can guide variable selection. Moreover, the structure of the hierarchical model provides both Bayesian and likelihood methods for selecting the Lasso parameter. Slight modifications lead to Bayesian versions of other Lasso-related estimation methods, including bridge regression and a robust variant.

Bibliographic Details
Published in: Journal of the American Statistical Association, 2008-06, Vol. 103 (482), p. 681-686
Main Authors: Park, Trevor; Casella, George
Format: Article
Language: English
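
The posterior-mode connection stated in the abstract can be written out explicitly. The sketch below assumes the standard Gaussian likelihood and an unconditional Laplace prior with rate λ; the paper itself scales the prior by σ, which changes only how λ maps onto the usual Lasso penalty.

```latex
% Likelihood: y | beta, sigma^2 ~ N(X beta, sigma^2 I)
% Prior:      pi(beta_j) = (lambda/2) exp(-lambda |beta_j|), independent
\begin{align*}
-\log \pi(\beta \mid y)
  &= \frac{1}{2\sigma^2}\,\lVert y - X\beta \rVert_2^2
   + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert + \text{const}, \\
\hat\beta_{\mathrm{mode}}
  &= \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2
   + 2\sigma^2 \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert,
\end{align*}
```

so the posterior mode is a Lasso estimate with penalty parameter 2σ²λ.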
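
The hierarchy the abstract outlines leads directly to a Gibbs sampler. The sketch below is an illustrative implementation, not the authors' code: β gets a multivariate-normal full conditional, σ² an inverse-gamma one (under the common improper 1/σ² prior), and 1/τ²ⱼ an inverse-Gaussian one, drawn here with NumPy's `wald` generator. The function name, defaults, and the small floor on β²ⱼ are choices made for this example.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    """Gibbs sampler for the Bayesian Lasso scale-mixture hierarchy:
    beta | sigma^2, tau^2 ~ N(0, sigma^2 diag(tau^2)),
    tau_j^2 ~ Exponential(rate = lam^2 / 2),
    which marginally gives beta_j independent Laplace priors."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start at least squares
    sigma2, inv_tau2 = 1.0, np.ones(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma^2 A^{-1}),  A = X'X + diag(1/tau^2)
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma^2 | rest ~ Inverse-Gamma((n-1+p)/2, rate below)
        resid = y - X @ beta
        rate = (resid @ resid + beta @ (inv_tau2 * beta)) / 2.0
        sigma2 = rate / rng.gamma((n - 1 + p) / 2.0)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma^2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        draws[t] = beta
    return draws
```

Posterior medians and the pointwise credible intervals the abstract mentions come straight from the draws, e.g. `np.percentile(draws, [2.5, 50, 97.5], axis=0)` after discarding a burn-in.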
DOI: 10.1198/016214508000000337
ISSN: 0162-1459
EISSN: 1537-274X
Source: International Bibliography of the Social Sciences (IBSS); JSTOR Archival Journals and Primary Sources Collection; Taylor and Francis Science and Technology Collection
Subjects:
Algorithms
Analytical estimating
Applications
Bayes estimators
Bayesian analysis
Bayesian method
Diabetes
Empirical Bayes
Estimation methods
Exact sciences and technology
General topics
Gibbs sampler
Hierarchical model
Hierarchies
Inverse Gaussian
Least squares
Linear inference, regression
Linear models
Linear regression
Logic and foundations
Mathematical logic, foundations, set theory
Mathematics
Maximum likelihood estimation
Musical intervals
Normal distribution
Parameter estimation
Parameter modification
Penalized regression
Probability and statistics
Recursion theory
Regression
Regression analysis
Robustness (mathematics)
Scale mixture of normals
Sciences and techniques of general use
Statistical analysis
Statistical discrepancies
Statistical median
Statistical methods
Statistics
Structural hierarchy
Theory and Methods