Results 1 - 5 of 5
1.
Parasit Vectors ; 17(1): 372, 2024 Sep 02.
Article in English | MEDLINE | ID: mdl-39223629

ABSTRACT

Mosquito-borne diseases are a major global health threat. Traditional morphological or molecular methods for identifying mosquito species often require specialized expertise or expensive laboratory equipment. The use of convolutional neural networks (CNNs) to identify mosquito species from images may offer a promising alternative, but their practical implementation often remains limited. This study explores the applicability of CNNs to classifying mosquito species. It compares the efficacy of body and wing depictions across three image collection methods: a smartphone, a macro lens attached to a smartphone, and a professional stereomicroscope. The study included 796 specimens of four morphologically similar Aedes species: Aedes aegypti, Ae. albopictus, Ae. koreicus and Ae. japonicus japonicus. The findings of this study indicate that CNN models perform better in wing-based classification (87.6%; 95% CI: 84.2-91.0) than in body-based classification (78.9%; 95% CI: 77.7-80.0). Nevertheless, CNNs have notable limitations: they perform reliably across multiple devices only when trained specifically on those devices, with mean accuracy otherwise declining by 14% on average, even with extensive image augmentation. Additionally, we estimate the training data volume required for effective classification, noting a reduced requirement for wing-based classification compared to body-based methods. Our study underscores the viability of both body- and wing-based classification for mosquito species identification while emphasizing the need to address practical constraints in developing accessible classification systems.
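The 14% cross-device accuracy drop reported above is the kind of domain shift that image augmentation tries to soften. As an illustration only (the study's actual augmentation pipeline is not specified here; the function name, jitter ranges, and choice of transforms are assumptions), a minimal NumPy sketch of photometric and geometric augmentation might look like:

```python
import numpy as np

def augment(img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Apply simple jitter to one image.

    `img` is an H x W x C float array with values in [0, 1]. The
    transforms (brightness shift, contrast scale, horizontal flip)
    are illustrative, meant to imitate differences between capture
    devices such as smartphones and stereomicroscopes.
    """
    out = img.copy()
    # Photometric jitter: random brightness shift and contrast scale.
    out += rng.uniform(-0.1, 0.1)
    out = 0.5 + (out - 0.5) * rng.uniform(0.8, 1.2)
    # Geometric jitter: flip left-right half of the time.
    if rng.random() < 0.5:
        out = out[:, ::-1, :]
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.uniform(size=(64, 64, 3))   # stand-in for one mosquito photo
aug = augment(img, rng)
```

Each training image would pass through such a function (with fresh random draws) every epoch, effectively multiplying the per-device training data; as the abstract notes, augmentation alone did not close the cross-device gap in this study.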


Subject(s)
Aedes; Deep Learning; Wings, Animal; Animals; Wings, Animal/anatomy & histology; Aedes/anatomy & histology; Aedes/classification; Image Processing, Computer-Assisted/methods; Mosquito Vectors/classification; Mosquito Vectors/anatomy & histology; Neural Networks, Computer; Smartphone; Culicidae/classification; Culicidae/anatomy & histology
2.
Sci Rep ; 14(1): 3094, 2024 02 07.
Article in English | MEDLINE | ID: mdl-38326355

ABSTRACT

Accurate species identification is crucial to assess the medical relevance of a mosquito specimen, but requires extensive experience of the observers and well-equipped laboratories. In this proof-of-concept study, we developed a convolutional neural network (CNN) to identify seven Aedes species from wing images only. While previous studies used images of the whole mosquito body, the nearly two-dimensional wings may facilitate standardized image capture and reduce the complexity of the CNN implementation. Mosquitoes were sampled from different sites in Germany. Their wings were mounted and photographed with a professional stereomicroscope. The data set consisted of 1155 wing images from seven Aedes species as well as 554 wings from different non-Aedes mosquitoes. A CNN was trained to differentiate between Aedes and non-Aedes mosquitoes and to classify the seven Aedes species based on grayscale and RGB images. Image processing, data augmentation, training, validation and testing were conducted in Python using the deep-learning framework PyTorch. Our best-performing CNN configuration achieved a macro F1 score of 99% in discriminating Aedes from non-Aedes mosquito species. The mean macro F1 score for predicting the Aedes species was 90% for grayscale images and 91% for RGB images. In conclusion, wing images are sufficient for CNNs to identify mosquito species.
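The study trained its CNN in PyTorch; purely as an illustration of the architecture pattern the abstract describes (convolution, pooling, then a classifier head over seven classes), here is a self-contained NumPy sketch of such a forward pass. The layer sizes, kernel counts, and weights are invented for the example and are not the authors' configuration.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid 2-D cross-correlation of x (H, W, C_in) with kernels
    (K, K, C_in, C_out); returns an (H-K+1, W-K+1, C_out) map."""
    K = kernels.shape[0]
    H, W, _ = x.shape
    out = np.empty((H - K + 1, W - K + 1, kernels.shape[-1]))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i + K, j:j + K, :]
            out[i, j] = np.tensordot(patch, kernels,
                                     axes=([0, 1, 2], [0, 1, 2]))
    return out

def forward(img, kernels, w, b):
    """Tiny CNN: conv -> ReLU -> global average pool -> linear -> softmax."""
    h = np.maximum(conv2d(img, kernels), 0.0)   # conv + ReLU
    feat = h.mean(axis=(0, 1))                  # global average pooling
    logits = feat @ w + b                       # linear classifier head
    e = np.exp(logits - logits.max())
    return e / e.sum()                          # class probabilities

rng = np.random.default_rng(1)
img = rng.uniform(size=(32, 32, 1))             # one grayscale wing image
kernels = rng.normal(scale=0.1, size=(3, 3, 1, 4))
w = rng.normal(scale=0.1, size=(4, 7))          # head over 7 Aedes species
b = np.zeros(7)
probs = forward(img, kernels, w, b)
```

A real pipeline would stack several such conv blocks and learn the weights by backpropagation; the sketch only shows why a nearly flat wing image maps naturally onto this kind of classifier.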


Subject(s)
Aedes; Culicidae; Animals; Neural Networks, Computer; Wings, Animal; Image Processing, Computer-Assisted/methods; Germany
3.
Acta Crystallogr D Struct Biol ; 78(Pt 2): 187-195, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-35102884

ABSTRACT

Contamination with diffraction from ice crystals can negatively affect, or even impede, macromolecular structure determination, and therefore detecting the resulting artefacts in diffraction data is crucial. However, once the data have been processed it can be very difficult to automatically recognize this problem. To address this, a set of convolutional neural networks named Helcaraxe has been developed which can detect ice-diffraction artefacts in processed diffraction data from macromolecular crystals. The networks outperform previous algorithms and will be available as part of the AUSPEX web server and the CCP4-distributed software.
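Helcaraxe itself is a set of CNNs, but the signal it learns from, anomalously strong intensity near the characteristic d-spacings of hexagonal ice, can be sketched with a much simpler heuristic. The following NumPy example is an assumption-laden illustration of that idea on synthetic data (function name, tolerances, and the z-score cutoff are invented), not the Helcaraxe algorithm:

```python
import numpy as np

# Approximate d-spacings (Angstrom) of strong hexagonal-ice rings.
ICE_D_SPACINGS = [3.90, 3.67, 3.44, 2.67, 2.25]

def flag_ice_shells(d, intensity, tol=0.03, z_cut=3.0):
    """Flag ice-ring resolutions whose thin resolution shell has
    anomalously high log-intensity relative to the overall data.

    d: per-reflection resolution (A); intensity: per-reflection I.
    Returns the subset of ICE_D_SPACINGS that look contaminated.
    """
    logI = np.log(np.clip(intensity, 1e-9, None))
    base, spread = np.median(logI), logI.std() + 1e-9
    flagged = []
    for ring in ICE_D_SPACINGS:
        shell = logI[np.abs(d - ring) < tol]   # reflections near the ring
        if shell.size and (shell.mean() - base) / spread > z_cut:
            flagged.append(ring)
    return flagged

# Synthetic data: smooth background plus one artificial ice ring at 3.44 A.
rng = np.random.default_rng(2)
d = rng.uniform(2.0, 4.0, 20000)
intensity = np.exp(rng.normal(size=20000))
intensity[np.abs(d - 3.44) < 0.03] *= 1000.0
flagged = flag_ice_shells(d, intensity)
```

A CNN-based detector such as Helcaraxe replaces this hand-tuned thresholding with learned features, which is what lets it outperform rule-based algorithms on real processed data.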


Subject(s)
Artifacts; Ice; Algorithms; Machine Learning; Macromolecular Substances/chemistry; Software
5.
bioRxiv ; 2020 Dec 28.
Article in English | MEDLINE | ID: mdl-33052340

ABSTRACT

During the COVID-19 pandemic, structural biologists rushed to solve the structures of the 28 proteins encoded by the SARS-CoV-2 genome in order to understand the viral life cycle and enable structure-based drug design. In addition to the 204 previously solved structures from SARS-CoV-1, 548 structures covering 16 of the SARS-CoV-2 viral proteins have been released in a span of only 6 months. These structural models serve as the basis for research to understand how the virus hijacks human cells, for structure-based drug design, and to aid in the development of vaccines. However, errors often occur in even the most careful structure determination, and may be even more common among these structures, which were solved quickly and under immense pressure. The Coronavirus Structural Task Force has responded to this challenge by rapidly categorizing, evaluating and reviewing all of these experimental protein structures in order to help downstream users and original authors. In addition, the Task Force provided improved models for key structures online, which have been used by Folding@Home, OpenPandemics, the EU JEDI COVID-19 challenge and others.
