Results 1 - 7 of 7
1.
Med Biol Eng Comput ; 2024 May 09.
Article in English | MEDLINE | ID: mdl-38722478

ABSTRACT

The accurate selection of the ultrasound plane for the fetal head and pubic symphysis is critical for precisely measuring the angle of progression. The traditional method depends heavily on sonographers manually selecting the imaging plane. This process is not only time-intensive and laborious but also prone to variability based on the clinicians' expertise. Consequently, there is a significant need for an automated method driven by artificial intelligence. To enhance the efficiency and accuracy of identifying the pubic symphysis-fetal head standard plane (PSFHSP), we proposed a streamlined neural network, PSFHSP-Net, based on a modified version of ResNet-18. This network comprises a single convolutional layer and three residual blocks designed to mitigate noise interference and bolster feature extraction capabilities. The model's adaptability was further refined by expanding the shared feature layer into task-specific layers. We assessed its performance against both traditional heavyweight and other lightweight models by evaluating metrics such as F1-score, accuracy (ACC), recall, precision, area under the ROC curve (AUC), model parameter count, and frames per second (FPS). The PSFHSP-Net recorded an ACC of 0.8995, an F1-score of 0.9075, a recall of 0.9191, and a precision of 0.9022, surpassing the other heavyweight and lightweight models on these metrics. Notably, it featured the smallest model size (1.48 MB) and the highest processing speed (65.7909 FPS), meeting the real-time processing criterion of over 24 images per second. While the AUC of our model was 0.930, slightly lower than that of ResNet-34 (0.935), it showed a marked improvement over ResNet-18 in testing, with increases in ACC and F1-score of 0.0435 and 0.0306, respectively. However, precision saw a slight decrease from 0.9184 to 0.9022, a reduction of 0.0162. Despite these trade-offs, the compression of the model significantly reduced its size from 42.64 MB to 1.48 MB and increased its inference speed by 4.4753 FPS, to 65.7909 FPS. The results confirm that PSFHSP-Net is capable of swiftly and effectively identifying the PSFHSP, thereby facilitating accurate measurement of the angle of progression. This development represents a significant advancement in automating fetal imaging analysis, promising enhanced consistency and reduced operator dependency in clinical settings.
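For orientation, below is a minimal PyTorch sketch of a compact, ResNet-18-derived classifier in the spirit of the network described above: one stem convolution, three residual blocks, and a shared pooled feature expanded into task-specific heads. The channel widths, strides, input size, and the auxiliary head are illustrative assumptions, not the published PSFHSP-Net architecture.

```python
import torch
import torch.nn as nn


class BasicBlock(nn.Module):
    """Standard ResNet basic block: two 3x3 convolutions plus a skip connection."""

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        self.down = None
        if stride != 1 or in_ch != out_ch:
            self.down = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        identity = x if self.down is None else self.down(x)
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)


class LightPlaneNet(nn.Module):
    """Compact ResNet-18-style backbone: one stem convolution, three residual
    blocks, and a shared pooled feature feeding task-specific heads (assumed layout)."""

    def __init__(self, num_classes=2):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 16, 7, stride=2, padding=3, bias=False),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2, padding=1),
        )
        self.blocks = nn.Sequential(
            BasicBlock(16, 32, stride=2),
            BasicBlock(32, 64, stride=2),
            BasicBlock(64, 128, stride=2),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head_plane = nn.Linear(128, num_classes)  # standard vs. non-standard plane
        self.head_aux = nn.Linear(128, 1)              # hypothetical auxiliary task head

    def forward(self, x):
        feat = self.pool(self.blocks(self.stem(x))).flatten(1)
        return self.head_plane(feat), self.head_aux(feat)


if __name__ == "__main__":
    net = LightPlaneNet()
    logits, aux = net(torch.randn(1, 1, 224, 224))  # one grayscale ultrasound frame
    print(logits.shape, aux.shape)                  # torch.Size([1, 2]) torch.Size([1, 1])
```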

2.
Int J Comput Assist Radiol Surg ; 18(8): 1489-1500, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36853584

ABSTRACT

PURPOSE: In recent years, breast cancer has become the greatest threat to women. There are many studies dedicated to the precise segmentation of breast tumors, which is indispensable in computer-aided diagnosis. Deep neural networks have achieved accurate segmentation of images. However, convolutional layers are biased toward extracting local features and tend to lose global and location information as the network deepens, which leads to a decrease in breast tumor segmentation accuracy. For this reason, we propose a hybrid attention-guided network (HAG-Net). We believe that this method will improve the detection rate and segmentation of tumors in breast ultrasound images. METHODS: The method is equipped with a multi-scale guidance block (MSG) for guiding the extraction of low-resolution location information. Short multi-head self-attention (S-MHSA) and a convolutional block attention module are used to capture global features and long-range dependencies. Finally, the segmentation results are obtained by fusing multi-scale contextual information. RESULTS: We compared our method with 7 state-of-the-art methods on two publicly available datasets through five random fivefold cross-validations. The highest Dice coefficient, Jaccard index, and detection rate ([Formula: see text]%, [Formula: see text]%, [Formula: see text]% and [Formula: see text]%, [Formula: see text]%, [Formula: see text]%, respectively) obtained on the two publicly available datasets (BUSI and OASUBD) prove the superiority of our method. CONCLUSION: HAG-Net can better utilize multi-resolution features to localize breast tumors, demonstrating excellent generalizability and applicability for breast tumor segmentation compared to other state-of-the-art methods.
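The abstract refers to a convolutional block attention module; below is a minimal PyTorch sketch of such a channel-and-spatial attention block in the CBAM style. The reduction ratio, spatial kernel size, and class name are generic illustrative choices, not HAG-Net's configuration.

```python
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """CBAM-style block: channel attention followed by spatial attention."""

    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        # Channel attention: a shared MLP over average- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: a convolution over channel-wise average and max maps.
        self.spatial = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        channel_att = torch.sigmoid(
            self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3)))
        ).view(b, c, 1, 1)
        x = x * channel_att
        spatial_att = torch.sigmoid(
            self.spatial(torch.cat(
                [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
            ))
        )
        return x * spatial_att


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)              # an encoder feature map
    print(ChannelSpatialAttention(64)(feat).shape)  # torch.Size([2, 64, 32, 32])
```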


Subject(s)
Breast Neoplasms; Image Processing, Computer-Assisted; Humans; Female; Image Processing, Computer-Assisted/methods; Ultrasonography, Mammary; Neural Networks, Computer; Breast Neoplasms/diagnostic imaging; Diagnosis, Computer-Assisted
3.
Med Biol Eng Comput ; 61(5): 1017-1031, 2023 May.
Article in English | MEDLINE | ID: mdl-36645647

ABSTRACT

The generalization ability of fetal head segmentation methods is reduced when data are acquired with different machines, settings, and operations. To preserve this generalization ability, we proposed a Fourier domain adaptation (FDA) method based on amplitude and phase to achieve better segmentation performance on multi-source ultrasound data. Given a source/target image pair, the Fourier-domain information was first obtained using the fast Fourier transform. Secondly, the target information was mapped to the source Fourier domain through the phase adjustment parameter α and the amplitude adjustment parameter β. Thirdly, the target image and the preprocessed source image obtained through the inverse discrete Fourier transform were used as the inputs of the segmentation network. Finally, the Dice loss was computed to adjust α and β. Among existing transform-based methods, the proposed method achieved the best performance. The adaptive-FDA method provides a solution for the automatic preprocessing of multi-source data. Experimental results show that it quantitatively improves the segmentation results and model generalization performance.
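A simplified NumPy sketch of the Fourier-domain idea described above follows: both images are transformed with the FFT, the source amplitude and phase are blended toward the target's with weights β and α, and the result is inverse-transformed. The linear blending rule and the naive phase interpolation (which ignores angle wrap-around) are assumptions for illustration, not the paper's exact adaptive α/β scheme.

```python
import numpy as np


def fourier_blend(source, target, alpha=0.1, beta=0.3):
    """Map target-domain spectral statistics into a source image (sketch).

    alpha weights the phase blend, beta weights the amplitude blend; both are
    illustrative defaults rather than learned values.
    """
    fs = np.fft.fft2(source.astype(np.float64))
    ft = np.fft.fft2(target.astype(np.float64))

    amp = (1.0 - beta) * np.abs(fs) + beta * np.abs(ft)
    # Naive phase interpolation, kept simple for illustration.
    pha = (1.0 - alpha) * np.angle(fs) + alpha * np.angle(ft)

    adapted = np.fft.ifft2(amp * np.exp(1j * pha)).real
    return np.clip(adapted, 0, 255).astype(np.uint8)


if __name__ == "__main__":
    src = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # placeholder source scan
    tgt = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # placeholder target scan
    print(fourier_blend(src, tgt).shape)
```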


Subject(s)
Head; Ultrasonography, Prenatal; Female; Pregnancy; Humans; Ultrasonography; Head/diagnostic imaging; Image Processing, Computer-Assisted/methods
4.
Front Physiol ; 13: 940150, 2022.
Article in English | MEDLINE | ID: mdl-36531181

ABSTRACT

Background: Accurate assessment of fetal descent by monitoring the fetal head (FH) station remains a clinical challenge in guiding obstetric management. The angle of progression (AoP) has been suggested to be a reliable and reproducible parameter for the assessment of FH descent. Methods: A novel framework, including image segmentation, target fitting, and AoP calculation, is proposed for evaluating fetal descent. For image segmentation, this study presents a novel double branch segmentation network (DBSN), which consists of two parts: an encoding part that receives the image input, and a decoding part composed of deformable convolutional blocks and ordinary convolutional blocks. The decoding part includes a lower and an upper branch; the feature map of the lower branch, after being constrained by an attention gate (AG), is used as the input of the upper branch to assist its decoding. Given an original transperineal ultrasound (TPU) image, the areas of the pubic symphysis (PS) and FH are first segmented using the proposed DBSN, the ellipse contours of the segmented regions are then fitted with the least-squares method, and three endpoints are finally determined for calculating the AoP. Results: Our private dataset with 313 TPU images was used for model evaluation with 5-fold cross-validation. The proposed method achieves the highest Dice coefficient (93.4%), the smallest Average Surface Distance (6.268 pixels), and the lowest AoP difference (5.993°) compared with four state-of-the-art methods. Similar results (Dice coefficient: 91.7%, Average Surface Distance: 7.729 pixels, AoP difference: 5.110°) were obtained on a public dataset with >3,700 TPU images, confirming its generalization performance. Conclusion: The proposed framework may be used for the automatic measurement of AoP with high accuracy and generalization performance. However, its clinical applicability needs to be further evaluated.
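The least-squares ellipse-fitting step can be illustrated with OpenCV's ellipse fit applied to a segmentation mask. The sketch below is a generic reconstruction of that step, not the authors' code, and the synthetic mask in the usage example is purely illustrative.

```python
import cv2
import numpy as np


def fit_region_ellipse(mask):
    """Fit an ellipse to the largest contour of a binary segmentation mask.

    Returns the ellipse centre (cx, cy), axis lengths, and rotation angle in
    degrees, as produced by OpenCV's least-squares ellipse fit.
    """
    contours, _ = cv2.findContours(
        mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE
    )
    largest = max(contours, key=cv2.contourArea)
    (cx, cy), axes, angle_deg = cv2.fitEllipse(largest)
    return (cx, cy), axes, angle_deg


if __name__ == "__main__":
    # Synthetic "fetal head" mask, purely for illustration.
    mask = np.zeros((256, 256), np.uint8)
    cv2.ellipse(mask, (128, 128), (60, 30), 20, 0, 360, 1, -1)
    print(fit_region_ellipse(mask))
```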

5.
Comput Math Methods Med ; 2022: 5192338, 2022.
Article in English | MEDLINE | ID: mdl-36092792

ABSTRACT

The angle of progression (AoP) for assessing fetal head (FH) descent during labor is measured from the standard plane of transperineal ultrasound images as the angle between a line through the long axis of the pubic symphysis (PS) and a second line from the right end of the PS tangentially to the contour of the FH. This paper presents a multitask network with a shared feature encoder and three task-specific decoders for standard plane recognition (Task1), image segmentation (Task2) of the PS and FH, and endpoint detection (Task3) of the PS. Based on the segmented FH and the two endpoints of the PS from standard plane images, we determined the point where the tangent line through the right endpoint of the PS touches the FH contour and then computed the AoP using these three points. In this paper, an efficient channel attention unit is introduced into the shared feature encoder to improve the robustness of layer region encoding, an attention fusion module is used to promote cross-branch interaction between the encoder for Task2 and that for Task3, and a shape-constrained loss function is designed to enhance robustness to noise based on a convex shape prior. We use the Pearson correlation coefficient and the Bland-Altman plot to assess the degree of agreement. The dataset includes 1964 images, of which 919 are nonstandard planes and the other 1045 are standard planes including the PS and FH. We achieved a classification accuracy of 92.26%; for the AoP calculation, the absolute mean (SD) of the difference in AoP (∆AoP) was 3.898° (3.192°), the Pearson correlation coefficient between manual and automated AoP was 0.964, and the Bland-Altman analysis showed that their agreement was statistically significant (P < 0.05). In conclusion, our approach can achieve a fully automatic measurement of the AoP with good efficiency and may help assess labor progress in the future.
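Once the two PS endpoints and the FH tangent point are known, the AoP reduces to the angle between two 2-D vectors. The sketch below shows one way to compute it; the convention of measuring against the extension of the symphysis axis beyond its right endpoint, and the example coordinates, are assumptions rather than the paper's exact implementation.

```python
import numpy as np


def angle_of_progression(ps_left, ps_right, fh_tangent):
    """Angle at ps_right between the extended symphysis axis and the tangent line.

    ps_left, ps_right: endpoints of the pubic symphysis long axis (pixels).
    fh_tangent: point where the line through ps_right touches the fetal head contour.
    Depending on the measurement convention, the supplementary angle may be reported.
    """
    p_left, p_right, p_tan = (np.asarray(p, dtype=float)
                              for p in (ps_left, ps_right, fh_tangent))
    v_axis = p_right - p_left   # symphysis axis direction, extended past ps_right
    v_tan = p_tan - p_right     # direction of the tangent to the fetal head
    cos_a = np.dot(v_axis, v_tan) / (np.linalg.norm(v_axis) * np.linalg.norm(v_tan))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))


if __name__ == "__main__":
    # Hypothetical pixel coordinates, for illustration only.
    print(round(angle_of_progression((50, 100), (150, 100), (230, 160)), 1))
```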


Subject(s)
Labor Presentation; Ultrasonography, Prenatal; Female; Fetus/diagnostic imaging; Humans; Neural Networks, Computer; Pregnancy; Reproducibility of Results; Ultrasonography, Prenatal/methods
7.
Data Brief ; 41: 107904, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35198683

ABSTRACT

The use of transperineal ultrasound techniques for the assessment of fetal head descent and progression is an adjunct to clinical examination. Automatic identification of parameters from ultrasound images will greatly reduce the subjectivity and non-repeatability of the clinician's judgment. However, the lack of a pubic symphysis-fetal head dataset hinders the development of algorithms. Here, we present an intrapartum transperineal ultrasound dataset from the Intelligent Fetal Monitoring Lab of Jinan University (named the JNU-IFM dataset), in which 78 intrapartum transperineal ultrasound videos were recorded from 51 patients. These data were obtained with the Youkey D8 wireless 2D ultrasound probe and its supporting software from Wuhan Youkey Bio-Medical Electronics Co., Ltd., Wuhan, China. From these videos, 6224 high-quality images across four categories were selected to form the JNU-IFM dataset. These images were labelled using the Pair software and then validated by two experienced radiologists. We hope that this dataset can be used for pubic symphysis-fetal head segmentation.
