
A novel prompt-tuning method: Incorporating scenario-specific concepts into a verbalizer

Bibliographic Details
Published in: Expert Systems with Applications, 2024-08, Vol. 247, Article 123204
Main Authors: Ma, Yong; Luo, Senlin; Shang, Yu-Ming; Li, Zhengjun; Liu, Yong
Format: Article
Language: English
Description
The verbalizer, which maps label words to class labels, is an essential component of prompt-tuning. In this paper, we present a novel approach to constructing verbalizers. Existing methods for verbalizer construction mainly rely on augmenting and refining sets of synonyms or related words based on class names; this paradigm suffers from a narrow perspective and a lack of abstraction, resulting in limited coverage of, and high bias in, the label-word space. To address this issue, we propose a label-word construction process that incorporates scenario-specific concepts. Specifically, we extract rich concepts from task-specific scenarios as label-word candidates and then develop a novel cascade calibration module to refine the candidates into a set of label words for each class. We evaluate the effectiveness of the proposed approach through extensive experiments on five widely used zero-shot text classification datasets. The results demonstrate that our method outperforms existing methods and achieves state-of-the-art results.

Highlights:
• Retrieving scenario-specific concepts as label-word candidates.
• Proposing a novel class-name-free cascade approach for label-word refining.
• Constructing a verbalizer with wide-ranging coverage and minimal bias.
• Reporting fresh state-of-the-art results.
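For readers unfamiliar with the mechanism, the following is a minimal sketch of how a verbalizer turns label-word probabilities into class scores in zero-shot prompt-tuning with a masked language model. It uses Hugging Face's transformers with bert-base-uncased; the prompt template, class names, and label words are hypothetical placeholders, and the paper's scenario-specific concept retrieval and cascade calibration module are not implemented here.

```python
# Minimal verbalizer sketch for zero-shot prompt-tuning (illustrative only).
# Assumptions: a BERT-style masked LM, single-token label words, and a
# hand-written template; the paper instead derives label words from
# scenario-specific concepts refined by cascade calibration.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Verbalizer: each class label is mapped to a set of label words.
# These are hypothetical placeholders, not the paper's label words.
verbalizer = {
    "sports": ["football", "athlete", "tournament"],
    "politics": ["election", "senate", "policy"],
}

def classify(text: str) -> str:
    # Wrap the input in a prompt template containing a [MASK] slot.
    prompt = f"{text} This article is about [MASK]."
    inputs = tokenizer(prompt, return_tensors="pt")
    # Locate the [MASK] position in the tokenized sequence.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    probs = logits.softmax(-1)
    # Score each class by averaging the probabilities of its label words
    # at the masked position, then pick the highest-scoring class.
    scores = {}
    for cls, words in verbalizer.items():
        ids = [tokenizer.convert_tokens_to_ids(w) for w in words]
        scores[cls] = probs[ids].mean().item()
    return max(scores, key=scores.get)

print(classify("The team won the championship after a dramatic final match."))
```

Each class score here is simply the mean masked-token probability of that class's label words; the paper's contribution lies in how the label-word sets are constructed (broad, low-bias coverage from scenario-specific concepts), not in this aggregation step.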
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2024.123204