Real-world validation of Artificial Intelligence-based Computed Tomography auto-contouring for prostate cancer radiotherapy planning
Published in: Physics and Imaging in Radiation Oncology, October 2023, Vol. 28, Article 100501
Format: Article
Language: English
Summary:
• Validation of commercial auto-contouring software for prostate cancer planning.
• Variability of manual vs automatic contouring is lower than inter-observer variability.
• Quick editing further improved performance, reducing inter-observer variability.
• Contouring time was reduced from 17–24 min to 3–7 min.
• An optimized and integrated automatic workflow was clinically implemented.
Artificial Intelligence (AI)-based auto-contouring for radiotherapy treatment planning needs extensive clinical validation, including the impact of editing after automatic segmentation. The aims of this study were to assess the performance of a commercial system for Clinical Target Volumes (CTVs) (prostate/seminal vesicles) and selected Organs at Risk (OARs) (rectum/bladder/femoral heads + femurs), also evaluating inter-observer variability (manual vs automatic + editing) and the reduction of contouring time.
Two expert observers contoured the CTVs/OARs of 20 patients in our Treatment Planning System (TPS). Computed Tomography (CT) images were sent to the automatic contouring workstation: automatic contours were generated and sent back to the TPS, where observers could edit them if necessary. Inter- and intra-observer consistency was estimated using Dice Similarity Coefficients (DSC). Radiation oncologists were also asked to score the quality of automatic contours, ranging from 1 (complete re-contouring) to 5 (no editing). Contouring times (manual vs automatic + editing) were compared.
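The Dice Similarity Coefficient used here is twice the overlap of two contours divided by the sum of their volumes, ranging from 0 (no overlap) to 1 (identical contours). The following is a minimal, illustrative Python sketch of that computation on binary voxel masks; the function name and toy masks are hypothetical and are not part of the study's commercial workflow.

```python
import numpy as np

def dice_similarity_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2 * |A intersect B| / (|A| + |B|) for two binary segmentation masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        # Both masks empty: conventionally treated as perfect agreement.
        return 1.0
    return 2.0 * intersection / total

# Toy 3D masks standing in for a manual and an automatic contour of one structure.
manual = np.zeros((4, 4, 4), dtype=bool)
automatic = np.zeros((4, 4, 4), dtype=bool)
manual[1:3, 1:3, 1:3] = True      # 8 voxels
automatic[1:3, 1:3, 2:4] = True   # 8 voxels, partially overlapping
print(f"DSC = {dice_similarity_coefficient(manual, automatic):.2f}")  # DSC = 0.50
```

In practice such a metric would be evaluated per structure (e.g. prostate, seminal vesicles, rectum, bladder, femoral heads) for manual vs automatic and manual vs edited contours.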
DSCs (manual vs automatic only) were consistent with inter-observer variability (between 0.65 for seminal vesicles and 0.94 for bladder); editing further improved performance (range: 0.76–0.94). The median clinical score was 4 (little editing) and it was …
ISSN: 2405-6316
DOI: 10.1016/j.phro.2023.100501