Publication: Deep Learning for Describing Breast Ultrasound Images with BI-RADS Terms
Date
2024
Authors
Parras Jurado, Manuela
Nogales, Alberto
Access rights
info:eu-repo/semantics/openAccess
Publisher
Springer
Abstract
Breast cancer is the most common cancer in women. Ultrasound is one of the most widely used techniques for its diagnosis, but an expert is needed to interpret the test. Computer-aided diagnosis (CAD) systems aim to help physicians during this process. Experts use the Breast Imaging-Reporting and Data System (BI-RADS), a common language for describing tumors according to several features (shape, margin, orientation...) and estimating their malignancy. To aid in tumor diagnosis with BI-RADS explanations, this paper presents a deep neural network for tumor detection, description, and classification. An expert radiologist described 749 nodules taken from public datasets with BI-RADS terms. The YOLO detection algorithm is used to obtain Regions of Interest (ROIs), and then a model based on a multi-class classification architecture receives each ROI as input and outputs the BI-RADS descriptors, the BI-RADS classification (with 6 categories), and a Boolean classification of malignancy. Six hundred of the nodules were used for 10-fold cross-validation (CV) and 149 for testing. The accuracy of this model was compared with state-of-the-art CNNs on the same task. The model outperforms plain classifiers in agreement with the expert (Cohen's kappa), with a mean over the descriptors of 0.58 in CV and 0.64 in testing, while the second-best model yielded kappas of 0.55 and 0.59, respectively. Adding YOLO to the model significantly enhances performance (by 0.16 in CV and 0.09 in testing). More importantly, training the model with BI-RADS descriptors enables explainability of the Boolean malignancy classification without reducing accuracy.
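The pipeline in the abstract (YOLO-cropped ROIs fed to a single network that outputs every BI-RADS descriptor, the 6-category BI-RADS score, and the Boolean malignancy label) can be illustrated with a minimal multi-head classifier. The sketch below is only an assumption of how such a model might look in PyTorch, with a ResNet-18 backbone and illustrative descriptor vocabularies; it is not the authors' implementation.

import torch
import torch.nn as nn
from torchvision import models

# Hypothetical descriptor vocabularies; names and class counts are illustrative only.
DESCRIPTOR_CLASSES = {
    "shape": 3,        # e.g. oval / round / irregular
    "margin": 5,       # e.g. circumscribed / indistinct / angular / microlobulated / spiculated
    "orientation": 2,  # parallel / not parallel
}

class BIRADSMultiHead(nn.Module):
    """One shared backbone with a classification head per output:
    each BI-RADS descriptor, the 6-category BI-RADS score, and malignancy."""

    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)   # assumption: any CNN backbone would do
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()                # keep pooled features only
        self.backbone = backbone
        self.descriptor_heads = nn.ModuleDict(
            {name: nn.Linear(feat_dim, n) for name, n in DESCRIPTOR_CLASSES.items()}
        )
        self.birads_head = nn.Linear(feat_dim, 6)      # BI-RADS categories
        self.malignancy_head = nn.Linear(feat_dim, 2)  # benign / malignant

    def forward(self, roi):                        # roi: YOLO-cropped patch, shape (B, 3, H, W)
        feats = self.backbone(roi)
        out = {name: head(feats) for name, head in self.descriptor_heads.items()}
        out["birads"] = self.birads_head(feats)
        out["malignant"] = self.malignancy_head(feats)
        return out

Agreement with the expert's labels could then be measured per output head with a standard Cohen's kappa implementation such as sklearn.metrics.cohen_kappa_score, which is the metric reported in the abstract.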
Description
This is the Accepted Manuscript of an article published by Springer in "The Journal of Imaging Informatics in Medicine" 2024, available online: https://doi.org/10.1007/s10278-024-01155-1
Keywords
breast ultrasound, BI-RADS, medical image captioning, computer aided diagnosis, attention mechanisms, explainable artificial intelligence
Citation
Carrilero-Mardones, M., Parras-Jurado, M., Nogales, A. et al. Deep Learning for Describing Breast Ultrasound Images with BI-RADS Terms. J Imaging Inform Med 37, 2940–2954 (2024). https://doi.org/10.1007/s10278-024-01155-1
Center
Faculties and Schools::E.T.S. de Ingeniería Informática
Department
Artificial Intelligence