Question-aware transformer models for consumer health question summarization
Published in: Journal of Biomedical Informatics, 2022-04, Vol. 128, Article 104040
Main Authors: , , ,
Format: Article
Language: English
Summary:

•We propose a new method for consumer-health question summarization by introducing various Cloze tasks to pretrained transformer models for better coverage of the question focus in the summarized questions.
•We also introduce explicit and implicit ways of infusing knowledge of the question type into pretrained transformer models for the generation of informative, question-type-driven summaries.
•Our proposed model achieves state-of-the-art performance on the consumer health question summarization task and outperforms recent state-of-the-art pretrained LM models (e.g., T5, PEGASUS, and BART).
•We conduct an automatic evaluation on the MeQSum benchmark and a manual evaluation to assess the correctness of the generated question summaries.
Searching for health information online is becoming customary for more and more consumers every day, which makes the need for efficient and reliable question answering systems more pressing. An important contributor to the success rates of these systems is their ability to fully understand the consumers' questions. However, these questions are frequently longer than needed and mention peripheral information that is not useful in finding relevant answers. Question summarization is one potential solution for simplifying long and complex consumer questions before attempting to find an answer. In this paper, we study the task of abstractive summarization for real-world consumer health questions. We develop an abstractive question summarization model that leverages the semantic interpretation of a question via recognition of medical entities, which enables the generation of informative summaries. Towards this, we propose multiple Cloze tasks (i.e., the task of filling in missing words in a given context) to identify the key medical entities, encouraging the model to achieve better coverage in question-focus recognition. Additionally, we infuse the decoder inputs with question-type information to generate question-type-driven summaries. When evaluated on the MeQSum benchmark corpus, our framework outperformed the state-of-the-art method by 10.2 ROUGE-L points. We also conducted a manual evaluation to assess the correctness of the generated summaries.
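The Cloze task mentioned in the abstract amounts to masking key spans and asking the model to recover them. As a minimal, hedged sketch (not the authors' implementation): the helper below masks medical-entity spans in a consumer question to produce a fill-in-the-blank training pair. In practice the entity spans would come from a medical NER tool; here the spans, the `[MASK]` token, and the example question are all illustrative assumptions.

```python
# Sketch: building a Cloze (fill-in-the-blank) example by masking medical
# entities in a consumer health question. Spans are hypothetical; a real
# pipeline would obtain them from a medical entity recognizer.

MASK = "[MASK]"

def make_cloze_example(question, entity_spans):
    """Replace each (start, end) character span with a mask token.

    Returns the masked question and the list of masked-out entity
    strings the model is trained to recover.
    """
    masked_parts, targets = [], []
    last = 0
    for start, end in sorted(entity_spans):
        masked_parts.append(question[last:start])  # text before the entity
        masked_parts.append(MASK)                  # blank out the entity
        targets.append(question[start:end])        # entity to recover
        last = end
    masked_parts.append(question[last:])           # trailing text
    return "".join(masked_parts), targets

question = "What are the side effects of metformin for type 2 diabetes?"
# Hypothetical spans for "metformin" and "type 2 diabetes".
spans = [(29, 38), (43, 58)]
masked, targets = make_cloze_example(question, spans)
# masked  -> "What are the side effects of [MASK] for [MASK]?"
# targets -> ["metformin", "type 2 diabetes"]
```

Training the summarizer to fill these blanks pushes it to attend to the medical entities, which is what the highlights describe as better coverage of the question focus.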
ISSN: 1532-0464, 1532-0480
DOI: 10.1016/j.jbi.2022.104040