Large language models in science
Published in: Urologie (Heidelberg, Germany), 2024-09, Vol. 63 (9), p. 860
Main Authors:
Format: Article
Language: German
Summary: Large language models (LLMs) are gaining popularity due to their ability to communicate in a human-like manner. Their potential for science, including urology, is increasingly recognized. However, concerns regarding the transparency, accountability, and accuracy of LLM results remain unresolved.
This review examines the ethical, technical, and practical challenges as well as the potential applications of LLMs in urology and science.
A selective literature review was conducted to analyze current findings and developments in the field of LLMs. The review considered studies on technical aspects, ethical considerations, and practical applications in research and practice.
LLMs such as GPT from OpenAI and Gemini from Google show great potential for processing and analyzing text data. Applications in urology include creating patient information and supporting administrative tasks. For purely clinical and scientific questions, however, these methods do not yet appear mature, and concerns about ethical issues and the accuracy of results persist.
LLMs have the potential to support research and practice through efficient data processing and information provision. Despite their advantages, ethical concerns and technical challenges must be addressed to ensure responsible and trustworthy use. Increased implementation could reduce the workload of urologists and improve communication with patients.
ISSN: 2731-7072
DOI: 10.1007/s00120-024-02396-2