2.
JCO Clin Cancer Inform ; 6: e2100176, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35749675

ABSTRACT

PURPOSE: Clear evidence indicating whether surgery or stereotactic body radiation therapy (SBRT) is best for non-small-cell lung cancer (NSCLC) is lacking. SBRT has many advantages. We used artificial neural networks (NNs) to predict treatment outcomes for patients with NSCLC receiving SBRT, aiming to aid in decision making.

PATIENTS AND METHODS: Among consecutive patients receiving SBRT between 2005 and 2019 in our institution, we retrospectively identified those with Tis-T4N0M0 NSCLC. We constructed two NNs to predict overall survival (OS) and cancer progression in the first 5 years after SBRT, which were tested using an internal and an external test data set. We performed risk group stratification, wherein 5-year OS and cancer progression were stratified into three groups.

RESULTS: In total, 692 patients from our institution and 100 patients randomly chosen from the external institution were enrolled. The NNs yielded concordance indexes for OS of 0.76 (95% CI, 0.73 to 0.79), 0.68 (95% CI, 0.60 to 0.75), and 0.69 (95% CI, 0.61 to 0.76) and areas under the curve for cancer progression of 0.80 (95% CI, 0.75 to 0.84), 0.72 (95% CI, 0.60 to 0.83), and 0.70 (95% CI, 0.57 to 0.81) in the training, internal test, and external test data sets, respectively. The survival and cumulative incidence curves were significantly stratified. The NNs identified low-risk groups with cancer progression rates of 5.6%, 6.9%, and 7.0% in the training, internal test, and external test data sets, respectively, suggesting that 48% of patients with peripheral Tis-T4N0M0 NSCLC may be at low risk for cancer progression.

CONCLUSION: Predictions of SBRT outcomes using NNs were useful for Tis-T4N0M0 NSCLC. Our results are anticipated to open new avenues for NN predictions and provide decision-making guidance for patients and physicians.
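The OS model above is evaluated with a concordance index (commonly Harrell's C). As a rough illustration of how such a metric is computed from model outputs, here is a minimal NumPy sketch; the function, variable names, and toy values are illustrative and not taken from the study.

```python
import numpy as np

def concordance_index(times, risk_scores, events):
    """Harrell's C-index: fraction of comparable patient pairs whose
    predicted risk ordering agrees with the observed survival ordering.

    times       -- observed follow-up times (e.g. years)
    risk_scores -- model output; higher score = higher predicted risk
    events      -- 1 if death was observed, 0 if the patient was censored
    """
    n = len(times)
    concordant, comparable = 0.0, 0
    for i in range(n):
        for j in range(n):
            # a pair is comparable if patient i had an observed event
            # before patient j's follow-up time ended
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# toy example (hypothetical values, not study data)
times = np.array([1.2, 4.8, 3.1, 5.0, 0.9])
events = np.array([1, 0, 1, 0, 1])
scores = np.array([0.82, 0.10, 0.55, 0.20, 0.90])
print(round(concordance_index(times, scores, events), 2))
```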


Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Radiosurgery , Humans , Lung Neoplasms/diagnosis , Lung Neoplasms/radiotherapy , Neoplasm Staging , Neural Networks, Computer , Radiosurgery/methods , Retrospective Studies
3.
Radiol Phys Technol ; 14(3): 318-327, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34254251

ABSTRACT

Deep learning has demonstrated high efficacy for automatic segmentation in contour delineation, which is crucial in radiation therapy planning. However, the collection, labeling, and management of medical imaging data can be challenging. This study aims to elucidate the effects of sample size and data augmentation on the automatic segmentation of computed tomography images using U-Net, a deep learning method. For the chest and pelvic regions, 232 and 556 cases were evaluated, respectively. We investigated multiple conditions by varying the combined number of training and validation cases across a broad range: 10-200 cases for the chest region and 10-500 cases for the pelvic region. A U-Net was constructed, and horizontal-flip data augmentation, which produces left-right mirrored images and thereby doubles the number of images, was compared with no augmentation for each training session. For all lung cases, and for prostate, bladder, and rectum cases once more than 100 cases were available, adding horizontal-flip data augmentation was almost as effective as doubling the number of cases. The slope of the Dice similarity coefficient (DSC) curve for all organs decreased rapidly up to approximately 100 cases, stabilized after 200 cases, and showed minimal change as the number of cases increased further. With data augmentation, the DSCs stabilized at a smaller sample size in all organs except the heart. This finding is applicable to the automation of radiation therapy for rare cancers, where large datasets may be difficult to obtain.
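As a minimal sketch of the horizontal-flip augmentation described above (mirroring each slice left-right so the dataset doubles), the following NumPy snippet is illustrative only; the array shapes and names are assumptions, not the study's actual pipeline.

```python
import numpy as np

def horizontal_flip_augment(images, masks):
    """Double a CT segmentation dataset by adding left-right mirrored copies.

    images -- array of shape (n_cases, height, width), CT slices
    masks  -- array of the same shape, ground-truth organ labels
    Returns arrays of shape (2 * n_cases, height, width).
    """
    flipped_images = images[:, :, ::-1]   # mirror along the left-right axis
    flipped_masks = masks[:, :, ::-1]     # flip the labels identically
    return (np.concatenate([images, flipped_images], axis=0),
            np.concatenate([masks, flipped_masks], axis=0))

# toy usage with random data (shapes are illustrative)
images = np.random.rand(10, 128, 128).astype(np.float32)
masks = (np.random.rand(10, 128, 128) > 0.5).astype(np.uint8)
aug_images, aug_masks = horizontal_flip_augment(images, masks)
print(aug_images.shape)  # (20, 128, 128)
```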


Subject(s)
Prostate , Tomography, X-Ray Computed , Humans , Lung , Male , Sample Size , Thorax
4.
Phys Med ; 78: 93-100, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32950833

ABSTRACT

PURPOSE: Deep learning has shown great efficacy for semantic segmentation. However, there are difficulties in the collection, labeling and management of medical imaging data, because of ethical complications and the limited number of imaging studies available at a single facility. This study aimed to find a simple and low-cost method to increase the accuracy of deep learning semantic segmentation for radiation therapy of prostate cancer.

METHODS: In total, 556 cases with non-contrast CT images for prostate cancer radiation therapy were examined using a two-dimensional U-Net. Initially, all slices were used for the input data. Then, we removed slices of the cranial portions, which were beyond the margins of the bladder and rectum. Finally, the ground truth labels for the bladder and rectum were added as channels to the input for the prostate training dataset.

RESULTS: The highest mean Dice similarity coefficients (DSCs) for each organ in the test dataset of 56 cases were 0.85 ± 0.05, 0.94 ± 0.04 and 0.85 ± 0.07 for the prostate, bladder and rectum, respectively. Removal of the cranial slices from the original images significantly increased the DSC of the rectum from 0.83 ± 0.09 to 0.85 ± 0.07 (p < 0.05). Adding bladder and rectum information to prostate training without removing the slices significantly increased the DSC of the prostate from 0.79 ± 0.05 to 0.85 ± 0.05 (p < 0.05).

CONCLUSIONS: These cost-free approaches may be useful for new applications, which may include updated models and datasets. They may be applicable to other organs at risk (OARs) and clinical targets such as elective nodal irradiation.
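The key input change in this study is stacking the bladder and rectum ground-truth labels as extra channels alongside the CT slice, after discarding cranial slices beyond those organs. The sketch below is a hypothetical NumPy reconstruction of that preprocessing, assuming slices are indexed cranial to caudal and the organ masks are available at training time; it is not the authors' code.

```python
import numpy as np

def build_prostate_input(ct_volume, bladder_mask, rectum_mask):
    """Stack the CT slice with bladder and rectum labels as extra input
    channels, after discarding cranial slices that contain neither organ.

    All arrays have shape (n_slices, height, width); organ masks are binary.
    Returns an array of shape (n_kept_slices, height, width, 3).
    """
    # slices that contain any bladder or rectum voxel
    has_organ = (bladder_mask.sum(axis=(1, 2)) > 0) | (rectum_mask.sum(axis=(1, 2)) > 0)
    # first slice containing either organ; assumes cranial-to-caudal ordering
    top = int(np.argmax(has_organ))
    keep = slice(top, ct_volume.shape[0])

    return np.stack([ct_volume[keep],
                     bladder_mask[keep].astype(np.float32),
                     rectum_mask[keep].astype(np.float32)],
                    axis=-1)

# toy usage (shapes are illustrative)
ct = np.random.rand(60, 128, 128).astype(np.float32)
bladder = np.zeros_like(ct); bladder[20:40] = 1.0
rectum = np.zeros_like(ct); rectum[25:55] = 1.0
print(build_prostate_input(ct, bladder, rectum).shape)  # (40, 128, 128, 3)
```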


Subject(s)
Deep Learning , Prostatic Neoplasms , Humans , Image Processing, Computer-Assisted , Male , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/radiotherapy , Semantics , Tomography, X-Ray Computed
5.
J Radiat Res ; 61(2): 257-264, 2020 Mar 23.
Article in English | MEDLINE | ID: mdl-32043528

ABSTRACT

This study aimed to examine the efficacy of semantic segmentation implemented by deep learning and to confirm whether this method is more effective than a commercially dominant auto-segmentation tool for delineating normal lung, excluding the trachea and main bronchi. A total of 232 non-small-cell lung cancer cases were examined. The computed tomography (CT) images of these cases were converted from Digital Imaging and Communications in Medicine (DICOM) Radiation Therapy (RT) format to arrays of 32 × 128 × 128 voxels and input into both 2D and 3D U-Net, deep learning networks for semantic segmentation. The numbers of training, validation and test cases were 160, 40 and 32, respectively. Dice similarity coefficients (DSCs) on the test set were evaluated for Smart Segmentation® Knowledge Based Contouring (an atlas-based segmentation tool) as well as for the 2D and 3D U-Net. The mean DSCs on the test set were 0.964 [95% confidence interval (CI), 0.960-0.968], 0.990 (95% CI, 0.989-0.992) and 0.990 (95% CI, 0.989-0.991) with Smart Segmentation, 2D U-Net and 3D U-Net, respectively. Compared with Smart Segmentation, both U-Nets achieved significantly higher DSCs by the Wilcoxon signed-rank test (P < 0.01). There was no difference in mean DSC between the 2D and 3D U-Net systems. The newly devised 2D and 3D U-Net approaches were found to be more effective than the commercial auto-segmentation tool. Even the relatively shallow 2D U-Net, which does not require high-performance computational resources, was effective enough for lung segmentation. Semantic segmentation using deep learning was useful in radiation treatment planning for lung cancers.
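Segmentation quality above is reported as the Dice similarity coefficient (DSC). A minimal NumPy implementation of the DSC for binary masks, with toy data sized like the 32 × 128 × 128 arrays mentioned in the abstract, might look like this (illustrative only; random masks, not study data):

```python
import numpy as np

def dice_similarity_coefficient(prediction, ground_truth, eps=1e-7):
    """DSC = 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks."""
    prediction = prediction.astype(bool)
    ground_truth = ground_truth.astype(bool)
    intersection = np.logical_and(prediction, ground_truth).sum()
    return 2.0 * intersection / (prediction.sum() + ground_truth.sum() + eps)

# toy example on a 32 x 128 x 128 voxel array
pred = np.random.rand(32, 128, 128) > 0.5
gt = np.random.rand(32, 128, 128) > 0.5
print(round(dice_similarity_coefficient(pred, gt), 3))  # ~0.5 for random masks
```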


Subject(s)
Bronchi/diagnostic imaging , Image Processing, Computer-Assisted , Lung/diagnostic imaging , Semantics , Trachea/diagnostic imaging , Algorithms , Humans , Imaging, Three-Dimensional , Tomography, X-Ray Computed