
DELTA: Dynamic Embedding Learning with Truncated Conscious Attention for CTR Prediction

Click-Through Rate (CTR) prediction is a pivotal task in product and content recommendation, where learning effective feature embeddings is of great significance. Traditional methods, however, typically learn fixed feature representations that are not dynamically refined according to context, leading to suboptimal performance. Some recent approaches attempt to address this issue by learning bit-wise weights or augmented embeddings for feature representations, but they suffer from uninformative or redundant features in the context.

To tackle this problem, we draw inspiration from the Global Workspace Theory of conscious processing, which posits that only a specific subset of product features is pertinent while the rest can be noisy and even detrimental to human click behavior. We propose DELTA, a CTR model that enables Dynamic Embedding Learning with Truncated conscious Attention. DELTA contains two key components: (I) a conscious truncation module (CTM), which uses curriculum learning to apply adaptive truncation to attention weights and select the most critical features in the context; and (II) explicit embedding optimization (EEO), an auxiliary training task that propagates the gradient directly and independently from the loss layer to the embedding layer, thereby explicitly optimizing the embeddings via linear feature crossing.

Extensive experiments on five challenging CTR datasets demonstrate that DELTA achieves new state-of-the-art performance among current CTR methods.
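The two components described above lend themselves to a compact illustration. Below is a minimal NumPy sketch of what adaptive truncation on attention weights (in the spirit of CTM) and an auxiliary linear feature-crossing term (in the spirit of EEO) might look like. All function names, shapes, and the top-k truncation rule are assumptions for illustration; the paper's actual architecture may differ.

```python
import numpy as np

def truncated_attention(query, keys, k):
    """Attention over context features that keeps only the top-k weights.

    A rough analogue of a conscious truncation module: uninformative
    features receive exactly zero weight instead of a small positive one.
    query: (d,), keys: (n, d). Returns (n,) truncated, renormalized weights.
    """
    scores = keys @ query / np.sqrt(keys.shape[1])  # scaled dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax
    # Truncate: zero out all but the k largest weights, then renormalize.
    cutoff = np.sort(weights)[-k]
    weights = np.where(weights >= cutoff, weights, 0.0)
    return weights / weights.sum()

def linear_crossing_logit(embeddings):
    """Auxiliary logit from pairwise inner products of feature embeddings.

    A factorization-machine-style second-order term: attaching its own loss
    gives the embedding table a direct gradient path, separate from the
    deep network. embeddings: (n, d). Returns a scalar logit.
    """
    s = embeddings.sum(axis=0)
    # FM identity: sum_{i<j} <e_i, e_j> = 0.5 * (||sum_i e_i||^2 - sum_i ||e_i||^2)
    return 0.5 * (s @ s - (embeddings * embeddings).sum())
```

During training, the auxiliary logit would feed its own sigmoid cross-entropy loss, added to the main loss; the curriculum schedule for annealing k (from all features down to a small subset) is omitted here.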


Bibliographic Details
Published in: arXiv.org, 2023-09
Main Authors: Chen, Zhu; Du, Liang; Chen, Hong; Zhao, Shuang; Sun, Zixun; Wang, Xin; Zhu, Wenwu
Format: Article
Language: English
Subjects: Attention; Embedding; Learning; Modules; Product specifications; Recommender systems
EISSN: 2331-8422
Source: Publicly Available Content Database