
SWI and CTP fusion model based on sparse representation method to predict cerebral infarction trend

Bibliographic Details
Published in: Frontiers in Neuroscience 2024-06, Vol.18, p.1360459
Main Authors: Wu, Guoqing, Wang, Hao, Ma, Xiaojun, Li, Huanyin, Song, Bin, Zhao, Jing, Wang, Xin, Lin, Jixian
Format: Article
Language: English
Description
Summary: The susceptibility-weighted imaging (SWI) signal is related to venous reflux disorder and perfusion defects, while computed tomography perfusion (CTP) captures perfusion information in both space and time; the two modalities therefore provide complementary information on the prognosis of cerebral infarction. Sixty-six retrospectively included patients were designated as the training set. Effective perfusion indicator features and radiomic features of the peri-infarction area were extracted from the SWI and CTP modality images of each case. Thirty-three prospectively included patients were designated as the test set for a machine learning model based on a sparse representation method. The predicted results were compared with the DWI findings from the patients' 7-10 day follow-up to assess the validity and accuracy of the prediction. The fused SWI + CTP model achieved the best performance, with an AUC of 0.952, ACC of 0.909, SEN of 0.889, and SPE of 0.933. By comparison, the AUC of the SWI-only model was 0.874 and that of the CTP-only model was 0.715, both inferior to the fused model. Combining the features of the two modalities further improves prediction of the changing trend of infarction volume.
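
A minimal sketch of how a sparse-representation classifier (SRC) of this kind could operate on fused SWI + CTP radiomic feature vectors is given below. The feature dimension, the Lasso-based sparse coding step, and all variable names are illustrative assumptions rather than the authors' implementation; the evaluation simply mirrors the AUC/ACC/SEN/SPE metrics reported above.

# Hypothetical SRC sketch for fused SWI + CTP radiomic feature vectors.
# Assumes binary labels (0 = non-progressing, 1 = progressing infarction); all
# shapes and parameters are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import roc_auc_score, accuracy_score, confusion_matrix

def src_predict(X_train, y_train, x_test, alpha=0.01):
    """Sparse-code one test vector over the training dictionary and classify by
    the smaller class-wise reconstruction residual."""
    D = X_train.T                                  # dictionary: columns = training samples
    coder = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    coder.fit(D, x_test)                           # solve x_test ~= D @ code with an L1 penalty
    code = coder.coef_
    res = [np.linalg.norm(x_test - D[:, y_train == c] @ code[y_train == c]) for c in (0, 1)]
    return int(res[1] < res[0]), res[0] - res[1]   # predicted label, score (higher -> class 1)

# Toy usage with random stand-ins for the 66 training / 33 test patients.
rng = np.random.default_rng(0)
X_tr, y_tr = rng.normal(size=(66, 120)), rng.integers(0, 2, 66)   # fused radiomic features
X_te, y_te = rng.normal(size=(33, 120)), rng.integers(0, 2, 33)

preds, scores = zip(*(src_predict(X_tr, y_tr, x) for x in X_te))
tn, fp, fn, tp = confusion_matrix(y_te, preds, labels=[0, 1]).ravel()
print(f"AUC={roc_auc_score(y_te, scores):.3f}  ACC={accuracy_score(y_te, preds):.3f}  "
      f"SEN={tp / (tp + fn):.3f}  SPE={tn / (tn + fp):.3f}")

Classifying by the smallest class-wise reconstruction residual is the standard SRC decision rule; the residual margin is used here as a continuous score so that an ROC curve and AUC can be computed alongside accuracy, sensitivity, and specificity.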
ISSN:1662-4548
1662-453X
DOI:10.3389/fnins.2024.1360459