SciELO Chile Collection

Department of Knowledge Management, Monitoring and Foresight
Questions or comments: productividad@anid.cl

Automated segmentation and classification of supraspinatus fatty infiltration in shoulder magnetic resonance image using a convolutional neural network
Indexed
WoS WOS:001313722700001
Scopus SCOPUS_ID:85204014920
DOI 10.3389/FMED.2024.1416169
Year 2024
Type research article

Indicator panel: Total Citations; Authors with Chilean Affiliation; Chilean Institutions; % International Participation; Authors with Foreign Affiliation; Foreign Institutions.

Abstract



Background: Goutallier's fatty infiltration of the supraspinatus muscle is a critical condition in degenerative shoulder disorders. Deep learning research primarily uses manual segmentation and labeling to detect this condition. Employing unsupervised training with a hybrid framework of segmentation and classification could offer an efficient solution.

Aim: To develop and assess a two-step deep learning model for detecting the region of interest and categorizing supraspinatus muscle fatty infiltration on magnetic resonance images (MRI) according to Goutallier's scale.

Materials and methods: A retrospective study was performed from January 1, 2019 to September 20, 2020, using 900 T2-weighted MRI images with supraspinatus muscle fatty infiltration diagnoses. A model with two sequential neural networks was implemented and trained. The first sub-model automatically detects the region of interest using a U-Net model. The second sub-model performs binary classification using the VGG-19 architecture. The model's performance was computed as the average over five-fold cross-validation. Loss, accuracy, Dice coefficient (95% CI), AUROC, sensitivity, and specificity (95% CI) were reported.

Results: Six hundred and six shoulder MRIs were analyzed. The Goutallier distribution was as follows: 0 (66.50%); 1 (18.81%); 2 (8.42%); 3 (3.96%); 4 (2.31%). The segmentation model achieved high accuracy (0.9977 ± 0.0002) and Dice score (0.9441 ± 0.0031); the classification model likewise achieved high accuracy (0.9731 ± 0.0230), sensitivity (0.9000 ± 0.0980), specificity (0.9788 ± 0.0257), and AUROC (0.9903 ± 0.0092).

Conclusion: The proposed two-step deep learning model demonstrated strong performance on both the segmentation and classification tasks.
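The abstract describes a two-step pipeline: a U-Net that segments the supraspinatus region of interest, followed by a VGG-19 that performs binary classification on that region, with segmentation quality scored by the Dice coefficient. The sketch below, in Keras/TensorFlow, is a minimal illustration of that structure; the layer widths, input shapes, and function names are illustrative assumptions, not the authors' actual settings.

# Minimal sketch of the two-step pipeline outlined in the abstract
# (U-Net segmentation -> VGG-19 binary classification). Layer sizes,
# input shapes, and names are assumptions, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_unet(input_shape=(256, 256, 1)):
    # Step 1: simplified U-Net predicting a binary ROI mask; the real
    # model would use a deeper encoder-decoder with more skip connections.
    inputs = layers.Input(input_shape)
    c1 = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(64, 3, activation="relu", padding="same")(p1)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c2)
    u1 = layers.concatenate([u1, c1])  # skip connection
    c3 = layers.Conv2D(32, 3, activation="relu", padding="same")(u1)
    mask = layers.Conv2D(1, 1, activation="sigmoid")(c3)
    return Model(inputs, mask, name="unet_roi")

def build_classifier(input_shape=(224, 224, 3)):
    # Step 2: VGG-19 backbone with a sigmoid head for the binary
    # Goutallier-scale decision made on the segmented region.
    base = tf.keras.applications.VGG19(include_top=False,
                                       weights="imagenet",
                                       input_shape=input_shape)
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)
    out = layers.Dense(1, activation="sigmoid")(x)
    return Model(base.input, out, name="vgg19_classifier")

def dice_coefficient(y_true, y_pred, smooth=1e-6):
    # Dice = 2|A ∩ B| / (|A| + |B|): the segmentation metric the abstract
    # reports as 0.9441 ± 0.0031 across five-fold cross-validation.
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + smooth)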

Journal



Journal ISSN
Frontiers in Medicine 2296-858X

External Metrics



PlumX Altmetric Dimensions

Shows external impact metrics associated with the publication.

Research Disciplines



WOS: Medicine, General & Internal
Scopus: Medicine (All)
SciELO: No disciplines assigned

Shows the distribution of disciplines for this publication.

WoS publications (Editions: ISSHP, ISTP, AHCI, SSCI, SCI), Scopus, SciELO Chile.

Institutional Collaboration



Shows the distribution of national and international collaboration generated by this publication.


Authors - Affiliation



No. Author Gender Institution - Country
1 Aitken-Saavedra, Juan Pablo Male Pontificia Universidad Católica de Valparaíso - Chile
2 Droppelmann, Guillermo Male Clinica Meds - Chile
Harvard T.H. Chan School of Public Health - United States
3 Jorquera-Aguilera, Carlos Alberto Male Universidad Mayor - Chile
4 Feijoo, F. Male Pontificia Universidad Católica de Valparaíso - Chile

Shows the affiliation and (detected) gender of the publication's co-authors.

Funding



Source
Universidad Mayor

Shows the funding source declared in the publication.

Acknowledgments



Acknowledgment
The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. The publication was financially supported by the Universidad Mayor.

Shows the acknowledgment declared in the publication.