Artificial intelligence and allergic rhinitis: does ChatGPT increase or impair the knowledge?
Published in: Journal of Public Health (Oxford, England), 2024-02, Vol. 46 (1), pp. 123–126
Main Authors:
Format: Article
Language: English
Summary: Abstract
Background
Optimal management of allergic rhinitis requires patient education with easy access to accurate information. However, previous online platforms have provided misleading information. The demand for online medical information continues to grow, especially with the introduction of advanced chatbots like ChatGPT.
Methods
This study aimed to evaluate the quality of the information provided by ChatGPT regarding allergic rhinitis. A 5-point Likert scale, ranging from 1 to 5, was used to assess the accuracy of the responses. Four authors independently rated the responses from a healthcare professional’s perspective.
Results
A total of 20 questions covering various aspects of allergic rhinitis were asked. Among the answers, eight received a score of 5 (no inaccuracies), five received a score of 4 (minor non-harmful inaccuracies), six received a score of 3 (potentially misinterpretable inaccuracies) and one answer had a score of 2 (minor potentially harmful inaccuracies).
Conclusions
The variability in accuracy scores highlights the need for caution when relying solely on chatbots like ChatGPT for medical advice. Patients should consult qualified healthcare professionals and use online sources as a supplement. While ChatGPT has advantages in medical information delivery, its use should be approached with caution. ChatGPT can be useful for patient education but cannot replace healthcare professionals.
ISSN: 1741-3842, 1741-3850
DOI: 10.1093/pubmed/fdad219