Interpretable deep learning for the prognosis of long-term functional outcome post-stroke using acute diffusion weighted imaging

Bibliographic Details
Published in: Journal of Cerebral Blood Flow and Metabolism, 2023-02, Vol. 43 (2), p. 198-209
Main Authors: Moulton, Eric; Valabregue, Romain; Piotin, Michel; Marnat, Gaultier; Saleme, Suzana; Lapergue, Bertrand; Lehericy, Stephane; Clarencon, Frederic; Rosso, Charlotte
Format: Article
Language: English
Description
Summary: Advances in deep learning can be applied to acute stroke imaging to build powerful and explainable prediction models that could supersede traditionally used biomarkers. We aimed to evaluate the performance and interpretability of a deep learning model based on convolutional neural networks (CNNs) in predicting long-term functional outcome from diffusion-weighted imaging (DWI) acquired at day 1 post-stroke. Ischemic stroke patients (n = 322) were included from the ASTER and INSULINFARCT trials as well as the Pitié-Salpêtrière registry. We trained a CNN to predict long-term functional outcome, assessed at 3 months with the modified Rankin Scale (dichotomized as good [mRS ≤ 2] vs. poor [mRS ≥ 3]), and compared its performance to two logistic regression models using lesion volume and ASPECTS. The CNN contained an attention mechanism, which made it possible to visualize the areas of the brain that drove the prediction. The deep learning model yielded a significantly higher area under the curve (0.83, 95% CI [0.78–0.87]) than lesion volume (0.78 [0.73–0.83]) and ASPECTS (0.77 [0.71–0.83]) (p
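The outcome definition and baseline comparison described in the abstract can be sketched in Python. This is a minimal illustration only: the patient data here are synthetic stand-ins (the study's data are not public), and the lesion-volume logistic regression below mimics the baseline comparator, not the authors' actual implementation or the CNN itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 322  # cohort size reported in the abstract

# Synthetic lesion volumes (mL) and a loosely correlated synthetic mRS (0-6).
lesion_volume = rng.lognormal(mean=3.0, sigma=1.0, size=n)
mrs = np.clip((lesion_volume / 40 + rng.normal(0, 1.5, n)).round(), 0, 6).astype(int)

# Dichotomize outcome as in the study: good (mRS <= 2) vs. poor (mRS >= 3).
poor_outcome = (mrs >= 3).astype(int)

# Logistic regression on (log) lesion volume, evaluated by AUC.
X = np.log(lesion_volume).reshape(-1, 1)
model = LogisticRegression().fit(X, poor_outcome)
auc = roc_auc_score(poor_outcome, model.predict_proba(X)[:, 1])
print(f"AUC (lesion-volume baseline, synthetic data): {auc:.2f}")
```

The study's comparison would additionally evaluate the CNN's predicted probabilities with the same `roc_auc_score` call and test the difference between the two AUCs on held-out data.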
ISSN: 0271-678X
eISSN: 1559-7016
DOI: 10.1177/0271678X221129230