1.
Sci Rep ; 14(1): 10569, 2024 05 08.
Article En | MEDLINE | ID: mdl-38719918

Within the medical field of human assisted reproductive technology, a method for interpretable, non-invasive, and objective oocyte evaluation is lacking. To address this clinical gap, a workflow utilizing machine learning techniques has been developed involving automatic multiclass segmentation of two-dimensional images, morphometric analysis, and prediction of developmental outcomes of mature denuded oocytes based on feature extraction and clinical variables. Two separate models have been developed for this purpose: a model to perform multiclass segmentation, and a classifier model to classify oocytes as likely or unlikely to develop into a blastocyst (Day 5-7 embryo). The segmentation model is highly accurate at segmenting the oocyte, ensuring high-quality segmented images (masks) are utilized as inputs for the classifier model (mask model). The mask model displayed an area under the curve (AUC) of 0.63, a sensitivity of 0.51, and a specificity of 0.66 on the test set. The AUC fell to 0.57 when features extracted from the ooplasm were removed, suggesting the ooplasm holds the information most pertinent to oocyte developmental competence. The mask model was further compared to a deep learning model, which also utilized the segmented images as inputs. The performance of both models combined in an ensemble model was evaluated, showing an improvement (AUC 0.67) compared to either model alone. The results of this study indicate that direct assessments of the oocyte are warranted, providing the first objective insights into key features for developmental competence, a step above the current standard of care, which relies solely on oocyte age as a proxy for quality.
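The abstract does not specify how the mask model and the deep learning model were combined; a common approach for merging two probabilistic classifiers is a soft-voting ensemble that averages their predicted probabilities. The sketch below illustrates that idea under stated assumptions: the probabilities and the weighting are hypothetical, not taken from the study.

```python
def ensemble_probability(p_mask, p_deep, w=0.5):
    """Soft-voting ensemble: weighted average of two models'
    predicted blastocyst-development probabilities.
    w is the weight given to the mask (morphometric) model."""
    return w * p_mask + (1 - w) * p_deep

# Hypothetical per-oocyte probabilities from the two models
p_mask = [0.42, 0.71, 0.55]   # mask model (features from segmented images)
p_deep = [0.50, 0.65, 0.61]   # deep learning model on the same segmented images
combined = [ensemble_probability(a, b) for a, b in zip(p_mask, p_deep)]
```

An equal weighting (w = 0.5) is the simplest choice; in practice the weight could be tuned on a validation set so that the ensemble's AUC exceeds either model alone, as reported here (0.67 vs. 0.63).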


Blastocyst; Machine Learning; Oocytes; Humans; Blastocyst/cytology; Blastocyst/physiology; Oocytes/cytology; Female; Embryonic Development; Adult; Fertilization in Vitro/methods; Image Processing, Computer-Assisted/methods
2.
Reprod Biomed Online ; 48(6): 103842, 2024 Jan 18.
Article En | MEDLINE | ID: mdl-38552566

RESEARCH QUESTION: Can a deep learning image analysis model be developed to assess oocyte quality by predicting blastocyst development from images of denuded mature oocytes? DESIGN: A deep learning model was developed utilizing 37,133 static oocyte images with associated laboratory outcomes from eight fertility clinics (six countries). A subset of data (n = 7807) was allocated to test model performance. External model validation was conducted to assess generalizability and robustness on new data (n = 12,357) from two fertility clinics (two countries). Performance was assessed by calculating area under the curve (AUC), balanced accuracy, specificity and sensitivity. Subgroup analyses were performed on the test dataset for age group, male factor and geographical location of the clinic. Model probabilities of the external dataset were converted to a 0-10 scoring scale to facilitate analysis of correlation with blastocyst development and quality. RESULTS: The deep learning model demonstrated AUC of 0.64, balanced accuracy of 0.60, specificity of 0.55 and sensitivity of 0.65 on the test dataset. Subgroup analyses displayed the highest performance for age group 38-39 years (AUC 0.68), a negligible impact of male factor, and good model generalizability across geographical locations. Model performance was confirmed on external data: AUC of 0.63, balanced accuracy of 0.58, specificity of 0.57 and sensitivity of 0.59. Analysis of the scoring scale revealed that higher scoring oocytes correlated with higher likelihood of blastocyst development and good-quality blastocyst formation. CONCLUSION: The deep learning model showed a favourable performance for the evaluation of oocytes in terms of competence to develop into a blastocyst, and when the predictions were converted into scores, they correlated with blastocyst quality. This represents a significant first step in oocyte evaluation for scientific and clinical applications.
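The paper converts model probabilities on the external dataset to a 0-10 scoring scale to analyse correlation with blastocyst development. The exact mapping is not given in the abstract; a minimal sketch, assuming a simple linear rescaling of the probability to an integer score, is:

```python
def probability_to_score(p):
    """Map a model probability in [0, 1] to a 0-10 integer oocyte score
    (hypothetical linear mapping; the study's actual conversion is not
    specified in the abstract)."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return round(p * 10)

# Hypothetical usage: higher-scoring oocytes would be those the model
# considers more likely to develop into a blastocyst
scores = [probability_to_score(p) for p in (0.12, 0.63, 0.91)]
```

The reported finding, that higher-scoring oocytes correlated with a higher likelihood of blastocyst development and good-quality blastocyst formation, only requires that the mapping be monotonic in the underlying probability, which any rescaling of this kind preserves.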
