1.
Appl Plant Sci; 8(7): e11373, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32765972

ABSTRACT

PREMISE: Weed removal in agriculture is typically achieved with herbicides. Autonomous weeding robots are a promising alternative, but their deployment requires the precise detection and identification of crops and weeds so that the robot can act efficiently.

METHODS: We trained and evaluated an instance segmentation convolutional neural network aimed at segmenting and identifying each plant specimen visible in images produced by agricultural robots. The training data set comprised field images on which the outlines of 2489 specimens from two crop species and four weed species were manually drawn. We adjusted the hyperparameters of a mask region-based convolutional neural network (Mask R-CNN) to this specific task and evaluated the resulting trained model.

RESULTS: The detection probability of the model was generally good but varied significantly with the species and size of the plants. In practice, between 10% and 60% of weeds could be removed without too great a risk of confusion with crop plants. Furthermore, we show that segmenting each plant makes it possible to determine precise action points, such as the barycenter of the plant surface.

DISCUSSION: Instance segmentation opens many possibilities for optimized weed removal. Weed electrification, for instance, could benefit from adjusting the voltage, frequency, and electrode location to the targeted plant. The results of this work will enable the evaluation of this type of weeding approach in the coming months.
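As a rough illustration of the kind of pipeline described in this abstract, the sketch below runs a generic pretrained torchvision Mask R-CNN on a field image and computes the barycenter of each predicted plant mask as a candidate action point. The pretrained weights, class labels, file name, and score threshold are assumptions for illustration; this is not the study's fine-tuned model or data.

import numpy as np
import torch
import torchvision
from PIL import Image
from torchvision.transforms.functional import to_tensor

# Load a generic pretrained Mask R-CNN (a stand-in for the study's fine-tuned model).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("field_image.jpg").convert("RGB")  # hypothetical robot image
with torch.no_grad():
    # torchvision detection models return one dict per image with
    # "boxes", "labels", "scores", and "masks" (N x 1 x H x W).
    prediction = model([to_tensor(image)])[0]

SCORE_THRESHOLD = 0.5  # assumed operating point, not the paper's
for label, score, mask in zip(prediction["labels"],
                              prediction["scores"],
                              prediction["masks"]):
    if score.item() < SCORE_THRESHOLD:
        continue
    binary_mask = (mask[0] > 0.5).numpy()  # (H, W) boolean plant mask
    ys, xs = np.nonzero(binary_mask)
    if xs.size == 0:
        continue
    # Barycenter of the segmented plant surface: a candidate action point
    # for a targeted weeding tool such as an electrode.
    cx, cy = xs.mean(), ys.mean()
    print(f"class={label.item()} score={score.item():.2f} "
          f"action_point=({cx:.1f}, {cy:.1f}) px")

In practice, only detections whose class and score indicate a weed with low risk of confusion with a crop plant would be passed on to the weeding tool.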

2.
Appl Plant Sci; 8(6): e11368, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32626610

ABSTRACT

PREMISE: Herbarium specimens are an outstanding source of material for studying plant phenological changes in response to climate change. Fine-scale phenological annotation of such specimens is nevertheless highly time consuming and requires substantial human investment and expertise, which are difficult to mobilize rapidly.

METHODS: We trained and evaluated new deep learning models to automate the detection, segmentation, and classification of four reproductive structures of Streptanthus tortuosus (flower buds, flowers, immature fruits, and mature fruits). We used a training data set of 21 digitized herbarium sheets on which the positions and outlines of 1036 reproductive structures were annotated manually. We adjusted the hyperparameters of a mask R-CNN (region-based convolutional neural network) to this specific task and evaluated the ability of the resulting trained models to count reproductive structures and estimate their size.

RESULTS: The main outcome of our study is that detection and segmentation performance can vary significantly with (i) the type of annotations used for training, (ii) the type of reproductive structure, and (iii) the size of the reproductive structure. For Streptanthus tortuosus, the method provides fairly accurate estimates of the number of reproductive structures (correct in 77.9% of cases), with counts estimated better for flowers than for immature fruits and buds. The size estimates are also encouraging, differing from the actual sizes of buds and flowers by only a few millimeters.

DISCUSSION: This method has great potential for automating the analysis of reproductive structures in high-resolution images of herbarium sheets. Deeper investigation of the taxonomic scalability of this approach and its potential improvements will be conducted in future work.
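As a minimal sketch of how such detections could be turned into counts and size estimates, the snippet below assumes torchvision-style prediction dicts, a hypothetical class mapping for the four reproductive structures, and an assumed scan resolution for converting pixels to millimeters; none of these values come from the study.

from collections import Counter

# Assumed label mapping and parameters, for illustration only.
CLASS_NAMES = {1: "flower bud", 2: "flower", 3: "immature fruit", 4: "mature fruit"}
SCORE_THRESHOLD = 0.5      # assumed operating point
DPI = 600                  # assumed scan resolution of the herbarium sheet
MM_PER_PIXEL = 25.4 / DPI  # millimeters per pixel at that resolution

def summarize(prediction):
    """Count detected structures per class and estimate their sizes in mm.

    `prediction` is assumed to be a torchvision-style dict with
    "labels", "scores", and "boxes" entries.
    """
    counts = Counter()
    sizes_mm = []
    for label, score, box in zip(prediction["labels"],
                                 prediction["scores"],
                                 prediction["boxes"]):
        if float(score) < SCORE_THRESHOLD:
            continue
        name = CLASS_NAMES.get(int(label), "unknown")
        counts[name] += 1
        x1, y1, x2, y2 = (float(v) for v in box)
        # Use the longer side of the bounding box as a simple size proxy.
        sizes_mm.append((name, max(x2 - x1, y2 - y1) * MM_PER_PIXEL))
    return counts, sizes_mm

Comparing such per-sheet counts and size estimates against the manual annotations is one straightforward way to reproduce the kind of evaluation reported in the abstract.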
