
Rabiu Okanlawon

K Castaño

Abstract

Background: Breast cancer is one of the leading causes of death among women worldwide. Early diagnosis plays a crucial role in improving survival rates; however, the limited interpretability of deep learning models often poses a significant barrier to their clinical adoption.


Objective: This study aims to develop a hybrid deep learning architecture that combines Convolutional Neural Networks (CNN) with ensemble learning, augmented with feature selection based on mutual information and SHAP values, to improve both accuracy and interpretability in breast cancer diagnosis from Fine Needle Aspirate (FNA) images.


Method: The FNA image dataset was processed through CNN-based feature extraction, followed by feature selection using mutual information and SHAP values. An ensemble model was built using stacking that combined CNN, Random Forest, and Gradient Boosting. Evaluation was performed using accuracy, F1-score, AUC, and significance tests against the baseline model.
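The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: synthetic features stand in for the CNN-extracted FNA features, mutual information is used for feature selection (the SHAP-based step is omitted), and a logistic-regression meta-learner stands in for the unspecified stacking combiner. All names and hyperparameters here are assumptions.

```python
# Hedged sketch: mutual-information feature selection feeding a stacking
# ensemble of Random Forest and Gradient Boosting, as outlined in the Method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for CNN-extracted features of FNA images
# (binary labels: benign vs. malignant).
X, y = make_classification(n_samples=600, n_features=64, n_informative=12,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

# Keep the k features with the highest mutual information with the label.
selector = SelectKBest(mutual_info_classif, k=20)

# Stacking ensemble: Random Forest and Gradient Boosting base learners,
# combined by a logistic-regression meta-learner (an assumption; the
# abstract does not specify the combiner).
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)

model = make_pipeline(selector, stack)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
proba = model.predict_proba(X_te)[:, 1]
print(f"accuracy={accuracy_score(y_te, pred):.3f}  "
      f"F1={f1_score(y_te, pred):.3f}  "
      f"AUC={roc_auc_score(y_te, proba):.3f}")
```

In the same spirit, per-feature SHAP values could be computed on the fitted base learners to rank texture and nuclear-morphology features, mirroring the interpretability analysis reported in the Results.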


Results: The proposed hybrid architecture achieved an accuracy of 98.4%, an F1-score of 0.983, and an AUC of 0.995, surpassing state-of-the-art approaches that only achieved an average accuracy of 96.2%. Interpretability analysis showed that features related to texture and cell nucleus morphology had the greatest contribution to model predictions.


Conclusion: This approach not only improves the performance of FNA-based breast cancer diagnosis but also provides interpretable results for medical professionals, potentially accelerating the clinical adoption of AI-based systems.
