CNN explanation methods for ordinal regression tasks
Published in: Neurocomputing, Vol. 615, January 2025, Article 128878
Main Authors:
Format: Article
Language: English
Summary: The use of Convolutional Neural Network (CNN) models for image classification tasks has gained significant popularity. However, the lack of interpretability in CNN models poses challenges for debugging and validation. To address this issue, various explanation methods have been developed to provide insights into CNN models. This paper focuses on the validity of these explanation methods for ordinal regression tasks, where the classes have a predefined order relationship. Different modifications are proposed for two explanation methods to exploit the ordinal relationships between classes: Grad-CAM based on Ordinal Binary Decomposition (GradOBD-CAM) and Ordinal Information Bottleneck Analysis (OIBA). The performance of these modified methods is compared with that of popular existing alternatives. Experimental results demonstrate that GradOBD-CAM outperforms the other methods in terms of interpretability on three out of four datasets, while OIBA outperforms the original IBA.
Highlights:
- Addressing interpretability challenges in CNN models for ordinal regression.
- Modification of explanation methods for ordinal regression tasks.
- Superior performance of GradOBD-CAM for interpretability in 3 out of 4 datasets.
- Enhanced interpretability with OIBA compared to IBA.
- Evaluation using comprehensive ordinal degradation score metrics.
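Since this record contains only the abstract and not the paper's full text, the sketch below is purely illustrative of the general idea: a standard Grad-CAM computed against an ordinal-binary-decomposition head, where the model emits K-1 logits for the binary targets p(y > k). The class OrdinalCNN, the function gradobd_cam, and the uniform weighting across threshold logits are all hypothetical assumptions, not the paper's actual GradOBD-CAM formulation, which may weight and combine the per-threshold maps differently.

# Illustrative sketch only (assumed interfaces); not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrdinalCNN(nn.Module):
    """Toy CNN whose head is an ordinal binary decomposition:
    one logit per threshold question 'is the label greater than k?'."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(32, num_classes - 1)  # K-1 binary tasks

    def forward(self, x):
        feats = self.features(x)            # (B, C, H, W) feature maps
        pooled = self.pool(feats).flatten(1)
        return self.head(pooled), feats     # logits for p(y > k), plus maps

def gradobd_cam(model, image, weights=None):
    """Grad-CAM aggregated over the ordinal binary logits.
    Uniform `weights` across thresholds is an assumption made here,
    not necessarily the scheme used in the paper."""
    model.eval()
    logits, feats = model(image)
    feats.retain_grad()                     # keep gradients of the feature maps
    if weights is None:
        weights = torch.ones(logits.shape[1])
    # Scalar target: weighted sum of the K-1 threshold logits p(y > k).
    target = (logits * weights).sum()
    target.backward()
    # Standard Grad-CAM: channel weights = spatially pooled gradients,
    # then a ReLU over the weighted combination of feature maps.
    alpha = feats.grad.mean(dim=(2, 3), keepdim=True)   # (B, C, 1, 1)
    cam = F.relu((alpha * feats).sum(dim=1))            # (B, H, W)
    cam = cam / (cam.amax(dim=(1, 2), keepdim=True) + 1e-8)
    return cam

if __name__ == "__main__":
    model = OrdinalCNN(num_classes=4)
    img = torch.randn(1, 3, 32, 32)
    print(gradobd_cam(model, img).shape)    # torch.Size([1, 32, 32])

Summing the threshold logits yields a single scalar whose gradient aggregates evidence across all K-1 binary tasks; passing non-uniform weights would let a caller focus the saliency map on a particular rank boundary, which is one plausible way an ordinal decomposition can be exploited.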
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2024.128878