HBNet: an integrated approach for resolving class imbalance and global local feature fusion for accurate breast cancer classification
Published in: Neural Computing & Applications, 2024-05, Vol. 36 (15), pp. 8455-8472
Main Authors: , ,
Format: Article
Language: English
Summary: Breast cancer, a widespread global disease, represents a significant threat to women’s health and lives. Many researchers have proposed computer-aided diagnosis systems for classifying breast cancer. The majority of these approaches rely primarily on deep learning methods, which overlook the crucial necessity of incorporating both global and local information for precise tumor detection. In addition, available breast cancer datasets are imbalanced in nature. Therefore, this paper presents the hybrid breast network (HBNet) for the detection of breast cancer, designed to address two critical challenges: class imbalance and the incorporation of both global and local information to achieve precise tumor classification. To overcome the problem of class imbalance, HBNet incorporates the borderline synthetic minority oversampling technique (Borderline-SMOTE). Simultaneously, it employs a feature fusion approach that combines deep and handcrafted features extracted with ResNet50 and the histogram of oriented gradients (HOG), thereby incorporating both global and local information. Moreover, the proposed method integrates the block-matching and 3D (BM3D) denoising filter to effectively eliminate multiplicative noise, which has enhanced the performance of the system. The proposed HBNet is evaluated on the BUSI and UDIAT datasets and achieves an average accuracy of 95.824% and 90.37%, respectively.
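The abstract describes three concrete building blocks: BM3D denoising of the ultrasound images, fusion of global ResNet50 features with local HOG descriptors, and Borderline-SMOTE to rebalance the training set. The Python sketch below shows how such a pipeline could be wired together under stated assumptions; it is not the authors' implementation, and the HOG parameters, the SVM classifier head, and the `extract_features`/`train` helpers are hypothetical choices made only for illustration.

```python
# Minimal sketch of an HBNet-style pipeline, assuming grayscale ultrasound images
# are already loaded as 2D NumPy arrays scaled to [0, 1]. A BM3D denoising step
# (e.g. via the third-party `bm3d` package) would precede feature extraction in
# the full pipeline; it is omitted here for brevity.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from imblearn.over_sampling import BorderlineSMOTE
from sklearn.svm import SVC
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input

# Pretrained ResNet50 backbone used as a global (deep) feature extractor.
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg")

def extract_features(img):
    """Fuse global ResNet50 features with local HOG descriptors for one image."""
    img224 = resize(img, (224, 224), anti_aliasing=True)
    # Global deep features: replicate the grayscale channel to RGB for the CNN.
    rgb = np.repeat(img224[..., None], 3, axis=-1) * 255.0
    deep = backbone.predict(preprocess_input(rgb[None]), verbose=0).ravel()
    # Local handcrafted HOG features (cell/block sizes are assumptions).
    handcrafted = hog(img224, orientations=9, pixels_per_cell=(16, 16),
                      cells_per_block=(2, 2))
    return np.concatenate([deep, handcrafted])

def train(images, labels):
    """Rebalance the fused feature set with Borderline-SMOTE, then fit a classifier."""
    X = np.stack([extract_features(im) for im in images])
    X_bal, y_bal = BorderlineSMOTE(random_state=0).fit_resample(X, np.asarray(labels))
    return SVC(kernel="rbf").fit(X_bal, y_bal)
```

Note that oversampling is applied in the fused feature space rather than on raw images here; that is one common design choice for SMOTE-style balancing, not necessarily the one used in the paper.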
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-024-09541-0