Two‐view attention‐guided convolutional neural network for mammographic image classification
Published in: CAAI Transactions on Intelligence Technology, 2023-06, Vol. 8 (2), pp. 453-467
Main Authors:
Format: Article
Language: English
Summary: Deep learning has been widely used in the field of mammographic image classification owing to its superiority in automatic feature extraction. However, general deep learning models cannot achieve very satisfactory classification results on mammographic images because they are not specifically designed for such images and do not account for their specific traits. To exploit the essential discriminant information of mammographic images, we propose a novel classification method based on a convolutional neural network. Specifically, the proposed method uses two branches to extract discriminative features from the mediolateral oblique (MLO) and craniocaudal (CC) mammographic views. The features extracted from the two‐view mammographic images contain complementary information that makes breast cancer easier to distinguish. Moreover, an attention block is introduced to capture channel‐wise information by adjusting the weight of each feature map, which helps emphasise the important features of mammographic images. Furthermore, we add a penalty term based on the fuzzy clustering algorithm to the cross‐entropy function, which improves the generalisation ability of the classification model by maximising the interclass distance and minimising the intraclass distance of the samples. The experimental results on the Digital Database for Screening Mammography (DDSM), INbreast, and MIAS mammography databases illustrate that the proposed method achieves the best classification performance and is more robust than the compared state‐of‐the‐art classification methods.
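The record contains no code, so the following is only a minimal PyTorch sketch of the two-view idea the abstract describes: one convolutional branch per view (MLO and CC), a squeeze-and-excitation-style channel-attention block standing in for the paper's attention block, and concatenation of the two view features before classification. All layer sizes, module names, and the SE-style attention are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: a two-branch CNN with channel attention for MLO/CC mammograms.
# Layer sizes and the SE-style attention block are assumptions for illustration.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """SE-style block: re-weights each feature map with a learned channel weight."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # channel-wise re-weighting of the feature maps


class ViewBranch(nn.Module):
    """One convolutional branch; the same structure is used for the MLO and CC views."""

    def __init__(self, out_channels: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(inplace=True), nn.MaxPool2d(2),
            nn.Conv2d(32, out_channels, 3, padding=1), nn.ReLU(inplace=True), nn.MaxPool2d(2),
            ChannelAttention(out_channels),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (batch, out_channels)


class TwoViewNet(nn.Module):
    """Fuses MLO and CC features and classifies the case (e.g. benign vs. malignant)."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.mlo_branch = ViewBranch()
        self.cc_branch = ViewBranch()
        self.classifier = nn.Linear(64 * 2, num_classes)

    def forward(self, mlo, cc):
        fused = torch.cat([self.mlo_branch(mlo), self.cc_branch(cc)], dim=1)
        return self.classifier(fused)
```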
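Likewise, a hedged sketch of the loss the abstract describes: standard cross-entropy plus a clustering-style penalty that keeps samples close to their class centres (small intraclass distance) while pushing the centres apart (large interclass distance). The fuzzy-c-means-style memberships, the weighting factor `lam`, and the use of externally maintained class centres are assumptions made for illustration; the paper's exact fuzzy-cluster formulation may differ.

```python
# Sketch only: cross-entropy plus a fuzzy-clustering-style intra/inter-class penalty.
import torch
import torch.nn.functional as F


def two_term_loss(features, logits, labels, centers, lam=0.1, m=2.0):
    """features: (B, D) embeddings, logits: (B, C), labels: (B,), centers: (C, D)."""
    ce = F.cross_entropy(logits, labels)

    # Fuzzy memberships: softer assignment the closer a sample is to a centre
    # (fuzzifier m > 1, as in fuzzy c-means).
    dist = torch.cdist(features, centers).clamp_min(1e-6)   # (B, C) distances
    u = dist ** (-2.0 / (m - 1.0))
    u = u / u.sum(dim=1, keepdim=True)                       # normalised memberships

    intra = (u * dist ** 2).sum(dim=1).mean()                # pull samples to centres
    inter = torch.pdist(centers).mean()                      # spread centres apart

    return ce + lam * (intra - inter)
```

In practice the class centres could be held as learnable parameters or as running means of the per-class features; that choice is not specified in the record and is left open here.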
ISSN: 2468-2322
DOI: 10.1049/cit2.12096