Results 1 - 3 of 3
1.
Appl Opt ; 62(29): 7611-7620, 2023 Oct 10.
Article in English | MEDLINE | ID: mdl-37855468

ABSTRACT

For high-precision industrial non-destructive testing, multimodal image registration can be used to register X-ray and neutron images. Existing X-ray and neutron registration algorithms typically rely on conventional iterative optimization, which increases registration time and requires more initialization parameters. Moreover, imaging of internal sample structures can suffer from edge blurring caused by the neutron beam collimator aperture, the X-ray focal spot, and the imaging angles. We present EDIRNet, an unsupervised deep-learning model for deformable registration of X-ray and neutron images. We formulate registration as a function that estimates the flow field from the input images, and we parameterize this function with a deep network. Consequently, given an image pair to register, the optimized network parameters enable rapid and direct estimation of the flow field between the images. We also design an attention-based edge enhancement module to strengthen the edge features of the image. To evaluate the network, we use a dataset of 552 pairs of X-ray and neutron images. The experimental results show that EDIRNet reaches a registration accuracy of 93.09%. Compared with traditional algorithms, EDIRNet improves accuracy by 3.17% and reduces registration time by 28.75 s.
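The abstract does not include code, but its core idea, a network that directly predicts a dense flow field and is trained without supervision using an image-similarity plus flow-smoothness loss, can be sketched. Below is a minimal PyTorch sketch under those assumptions; the tiny CNN, the MSE similarity term, and all names are illustrative, and EDIRNet's actual architecture (including its attention-based edge enhancement module) is not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FlowEstimator(nn.Module):
    """Tiny CNN mapping a (fixed, moving) image pair to a 2-channel flow field."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # per-pixel (dx, dy) in pixels
        )

    def forward(self, fixed, moving):
        return self.net(torch.cat([fixed, moving], dim=1))

def warp(moving, flow):
    """Warp `moving` with a dense flow field via differentiable sampling."""
    b, _, h, w = moving.shape
    # Identity sampling grid in normalized [-1, 1] coordinates, (x, y) order.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=moving.device),
        torch.linspace(-1, 1, w, device=moving.device),
        indexing="ij",
    )
    grid = torch.stack([xs, ys], dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
    # Convert pixel displacements to normalized coordinates before sampling.
    norm = torch.tensor([2.0 / max(w - 1, 1), 2.0 / max(h - 1, 1)],
                        device=moving.device)
    return F.grid_sample(moving, grid + flow.permute(0, 2, 3, 1) * norm,
                         align_corners=True)

def unsupervised_loss(fixed, warped, flow, smooth_weight=0.01):
    """Image similarity plus a smoothness penalty on flow gradients."""
    sim = F.mse_loss(warped, fixed)  # stand-in; the paper's loss may differ
    smooth = (flow[:, :, 1:, :] - flow[:, :, :-1, :]).abs().mean() + \
             (flow[:, :, :, 1:] - flow[:, :, :, :-1]).abs().mean()
    return sim + smooth_weight * smooth

# One training step on a dummy X-ray / neutron pair.
model = FlowEstimator()
fixed, moving = torch.rand(1, 1, 128, 128), torch.rand(1, 1, 128, 128)
flow = model(fixed, moving)
loss = unsupervised_loss(fixed, warp(moving, flow), flow)
loss.backward()
```

Because the network amortizes the optimization, inference is a single forward pass per image pair, which is what yields the reported speedup over iterative methods.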

2.
Front Plant Sci ; 14: 1273029, 2023.
Article in English | MEDLINE | ID: mdl-38333041

ABSTRACT

Disease image classification systems play a crucial role in identifying disease categories in agriculture. However, current plant disease image classification methods only predict the disease category and do not explain the characteristics of the predicted disease images. To address this limitation, this paper employs image description generation to produce distinct descriptions for different plant disease categories. We propose a two-stage model, DIC-Transformer, covering three tasks: detection, interpretation, and classification. In the first stage, Faster R-CNN with a Swin Transformer backbone detects the diseased area and generates the feature vector of the diseased image. In the second stage, a Transformer generates image captions and produces an image feature vector weighted by text features, improving the performance of the subsequent classification decoder. Additionally, we compiled ADCG-18, a dataset of text and images for agricultural diseases, containing images of 18 diseases and descriptions of their characteristics. On ADCG-18, DIC-Transformer was compared with 11 classical caption generation methods and 10 image classification models. Caption quality was evaluated with BLEU-1 through BLEU-4, CIDEr-D, and ROUGE; DIC-Transformer scored 0.756 (BLEU-1), 450.51 (CIDEr-D), and 0.721 (ROUGE), which is 0.01, 29.55, and 0.014 higher than the best-performing comparison model, Fc. Classification was evaluated with accuracy, recall, and F1 score: DIC-Transformer achieved 0.854, 0.854, and 0.853, respectively, which is 0.024, 0.078, and 0.075 higher than the best-performing comparison model, MobileNetV2. These results indicate that DIC-Transformer outperforms the comparison models in both classification and caption generation.
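As a rough illustration of the second-stage idea, re-weighting image features by generated-caption features before classification, here is a minimal PyTorch sketch. The class name TextWeightedClassifier, the dimensions, and the attention layout are assumptions for illustration only; the actual DIC-Transformer (Swin backbone, Faster R-CNN detector, full caption decoder) is not reproduced.

```python
import torch
import torch.nn as nn

class TextWeightedClassifier(nn.Module):
    """Illustrative second stage: image region features are re-weighted by
    attention over caption token features, then pooled for classification."""
    def __init__(self, dim=256, num_classes=18):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, region_feats, caption_feats):
        # region_feats: (B, R, dim) from the detector;
        # caption_feats: (B, T, dim) from the caption decoder.
        # Queries are image regions, keys/values are caption tokens, so each
        # region is re-expressed in terms of the generated description.
        weighted, _ = self.attn(region_feats, caption_feats, caption_feats)
        pooled = weighted.mean(dim=1)  # global pooling over regions
        return self.head(pooled)       # logits over the 18 disease classes

model = TextWeightedClassifier()
regions = torch.rand(2, 36, 256)  # e.g., 36 detected regions per image
tokens = torch.rand(2, 20, 256)   # e.g., 20 caption tokens per image
logits = model(regions, tokens)   # shape (2, 18)
```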

3.
J Cancer Res Ther ; 16(4): 867-873, 2020.
Article in English | MEDLINE | ID: mdl-32930132

ABSTRACT

OBJECTIVE: The objective of this paper was to investigate hub genes of postmenopausal osteoporosis (PO) using a benchmarked dataset and a gene regulatory network (GRN). MATERIALS AND METHODS: First, the dataset downloaded from the ArrayExpress database was benchmarked by adding local noise and global noise. Second, differentially expressed genes (DEGs) between PO and normal controls were identified with the Linear Models for Microarray Data (limma) package on the benchmarked dataset. Third, five GRN inference methods, Zscore, GeneNet, the context likelihood of relatedness (CLR) algorithm, Partial Correlation coefficient with Information Theory (PCIT), and GEne Network Inference with Ensemble of trees (GENIE3), were described and evaluated using receiver operating characteristic (ROC) and precision-recall (PR) curves. Finally, the GRN constructed with the best-performing method was analyzed by topological centrality (closeness) to investigate hub genes of PO. RESULTS: A total of 236 DEGs were obtained from the benchmarked dataset of 20,554 genes. Assessed on ROC and PR curves, GENIE3 had a clear advantage over the other methods and was applied to construct the GRN, which comprised 236 nodes and 27,730 edges. Closeness centrality analysis of the GRN identified 14 hub genes (including TTN, ACTA1, and MYBPC1) for PO. CONCLUSION: We identified 14 hub genes (including TTN, ACTA1, and MYBPC1) based on the benchmarked dataset and the GRN. These genes may be potential biomarkers and provide insights for the diagnosis and treatment of PO.


Subject(s)
Gene Regulatory Networks; Osteoporosis, Postmenopausal/genetics; Algorithms; Benchmarking; Biomarkers/metabolism; Computational Biology/methods; Databases, Genetic; Female; Gene Expression Profiling/methods; Humans; Osteoporosis, Postmenopausal/metabolism; Osteoporosis, Postmenopausal/pathology; Protein Interaction Maps; ROC Curve
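The final hub-gene step is simple to sketch: rank the nodes of the inferred network by closeness centrality and keep the top candidates. Below is a minimal Python sketch using networkx, assuming a GENIE3-style weighted edge list is already available; the example triples are placeholders for illustration, not data from the paper.

```python
import networkx as nx

# Placeholder GENIE3-style output: (regulator, target, confidence) triples.
# In practice this would hold the 27,730 inferred edges among the 236 DEGs.
edges = [
    ("TTN", "ACTA1", 0.92),
    ("TTN", "MYBPC1", 0.88),
    ("ACTA1", "MYBPC1", 0.75),
]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# Closeness centrality: genes with a short average path to all other genes
# score highest (weights are confidences, so paths are left unweighted here).
closeness = nx.closeness_centrality(G)

# Rank genes and keep the top k as candidate hubs (the paper keeps 14).
hubs = sorted(closeness.items(), key=lambda kv: kv[1], reverse=True)[:14]
for gene, score in hubs:
    print(f"{gene}\t{score:.3f}")
```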