Show simple item record

dc.contributor.author: Garcia Atutxa, Igor
dc.contributor.author: Martínez Más, José
dc.contributor.author: Bueno Crespo, Andrés
dc.contributor.author: Villanueva Flores, Francisca
dc.date.accessioned: 2025-12-09T09:00:01Z
dc.date.available: 2025-12-09T09:00:01Z
dc.date.issued: 2025-10-15
dc.identifier.citation: Garcia-Atutxa, I., Martínez-Más, J., Bueno-Crespo, A., & Villanueva Flores, F. (2025). Early-fusion hybrid CNN-Transformer models for multiclass ovarian tumor ultrasound classification. Frontiers in Artificial Intelligence, 8, 1679310.
dc.identifier.uri: http://hdl.handle.net/10952/10524
dc.description.abstract: Ovarian cancer remains the deadliest gynecologic malignancy, and transvaginal ultrasound (TVS), the first-line test, still suffers from limited specificity and operator dependence. We introduce a learned early-fusion (joint projection) hybrid that couples EfficientNet-B7 (local descriptors) with a Swin Transformer (hierarchical global context) to classify eight ovarian tumor categories from 2D TVS. Using the public, de-identified OTU-2D dataset (n = 1,469 images across eight histopathologic classes), we conducted patient-level, stratified 5-fold cross-validation repeated 10×. To address class imbalance while preventing leakage, training used train-only oversampling, ultrasound-aware augmentations, and strong regularization; validation/test folds were never resampled. The hybrid achieved AUC 0.9904, accuracy 92.13%, sensitivity 92.38%, and specificity 98.90%, outperforming single CNN or ViT baselines. A soft ensemble of the top hybrids further improved performance to AUC 0.991, accuracy 93.3%, sensitivity 93.6%, and specificity 99.0%. Beyond discrimination, we provide deployment-oriented evaluation: isotonic calibration yielded reliable probabilities, decision-curve analysis showed net clinical benefit across 5–20% risk thresholds, entropy-based uncertainty supported confidence-based triage, and Grad-CAM highlighted clinically salient regions. All metrics are reported with 95% bootstrap confidence intervals, and the evaluation protocol preserves real-world data distributions. Taken together, this work advances ovarian ultrasound AI from accuracy-only reporting to calibrated, explainable, and uncertainty-aware decision support, offering a reproducible reference framework for multiclass ovarian ultrasound and a clear path toward clinical integration and prospective validation.
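The abstract's central mechanism, learned early fusion by joint projection of CNN and transformer features, plus the entropy-based uncertainty used for triage, can be sketched in a few lines. All dimensions, weight initializations, and names below are illustrative assumptions for exposition, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pooled feature vectors from the two branches (sizes are assumptions):
cnn_feat = rng.standard_normal(2560)   # e.g. EfficientNet-B7 global pool
swin_feat = rng.standard_normal(1024)  # e.g. Swin Transformer global pool

# Learned joint projection: concatenate both branches ("early fusion")
# and map them into one shared embedding before the classifier head.
W_joint = rng.standard_normal((512, 2560 + 1024)) * 0.01
b_joint = np.zeros(512)
fused = np.concatenate([cnn_feat, swin_feat])
joint = np.tanh(W_joint @ fused + b_joint)

# Classifier head over the eight tumor categories, then softmax.
W_cls = rng.standard_normal((8, 512)) * 0.01
logits = W_cls @ joint
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Entropy of the predictive distribution: high entropy = low confidence,
# which a triage rule could route to human review.
entropy = float(-np.sum(probs * np.log(probs + 1e-12)))
print(probs.shape, round(entropy, 3))
```

In a trained model the projection and classifier weights would be learned jointly with the two backbones; the sketch only shows how the fused representation and the uncertainty score are formed.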
dc.language.iso: en
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Ovarian cancer
dc.subject: Ultrasound imaging
dc.subject: Deep learning
dc.subject: CNN
dc.subject: Vision transformer
dc.subject: Hybrid model
dc.subject: Early diagnosis
dc.title: Early-fusion hybrid CNN-transformer models for multiclass ovarian tumor ultrasound classification
dc.type: journal article
dc.rights.accessRights: open access
dc.journal.title: Frontiers in Artificial Intelligence
dc.volume.number: 8
dc.description.discipline: Engineering, Industry and Construction
dc.identifier.doi: 10.3389/frai.2025.1679310
dc.description.faculty: Escuela Politécnica


Files in this item

This item appears in the following collection(s)


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International.