Sensors (Basel); 22(22), 2022 Nov 15.
Article in English | MEDLINE | ID: mdl-36433404

ABSTRACT

Robust and automated image segmentation for high-throughput image-based plant phenotyping has received considerable attention in the last decade. The potential of this approach has nevertheless remained underexplored, because manual segmentation is time-consuming and appropriate datasets are scarce. Segmenting greenhouse- and open-field-grown crops from the background is a challenging task, complicated by factors such as complex backgrounds (humans, equipment, devices, and machinery present for crop-management practices), environmental conditions (humidity, cloudy/sunny skies, fog, rain), occlusion, low contrast, and variability in crop appearance and pose over time. This paper presents ThelR547v1 (Thermal RGB 547 layers version 1), a new ubiquitous deep learning architecture that segments each pixel as crop or crop canopy versus background (non-crop) in real time by abstracting multi-scale contextual information at reduced memory cost. Evaluated on 37,328 augmented images (aug1: thermal RGB and RGB), the method achieves mean IoU scores of 0.94 and 0.87 for leaves and background, respectively, and mean BF scores of 0.93 and 0.86. ThelR547v1 reaches a training accuracy of 96.27% with a training loss of 0.09, and a validation accuracy of 96.15% with a validation loss of 0.10. Qualitative analysis further shows that, despite the low resolution of the training data, ThelR547v1 successfully distinguishes leaf/canopy pixels from complex and noisy background pixels, making it suitable for real-time semantic segmentation of horticultural crops.
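
The reported per-class metrics are the two standard ones for binary segmentation: intersection-over-union (IoU) and the boundary F1 (BF) score. The sketch below, which is not the authors' code, shows how both are conventionally computed for a boolean leaf-vs-background mask; the function names, the 2-pixel boundary tolerance, and the toy masks are illustrative assumptions.

import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def iou(pred, truth):
    # Intersection-over-union for one boolean class mask.
    union = np.logical_or(pred, truth).sum()
    return np.logical_and(pred, truth).sum() / union if union else 1.0

def bf_score(pred, truth, tol=2):
    # Boundary F1: precision/recall of boundary pixels matched within
    # `tol` pixels of the other mask's boundary (tolerance is assumed).
    def boundary(mask):
        # Mask pixels removed by a one-step erosion form the boundary.
        return mask & ~binary_erosion(mask)
    bp, bt = boundary(pred), boundary(truth)
    precision = (bp & binary_dilation(bt, iterations=tol)).sum() / max(bp.sum(), 1)
    recall = (bt & binary_dilation(bp, iterations=tol)).sum() / max(bt.sum(), 1)
    return 2 * precision * recall / max(precision + recall, 1e-9)

# Toy 8x8 example: the prediction misses one corner pixel of the true leaf.
truth = np.zeros((8, 8), dtype=bool)
truth[2:6, 2:6] = True
pred = truth.copy()
pred[2, 2] = False
print(f"IoU (leaf):       {iou(pred, truth):.3f}")    # 15/16 = 0.938
print(f"IoU (background): {iou(~pred, ~truth):.3f}")  # 48/49 = 0.980
print(f"BF score (leaf):  {bf_score(pred, truth):.3f}")

Averaging these per-class values over all evaluation images yields the mean IoU and mean BF scores quoted above; the paper does not state which boundary tolerance was used.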


Subject(s)
Image Processing, Computer-Assisted; Semantics; Humans; Image Processing, Computer-Assisted/methods; Neural Networks, Computer; Crops, Agricultural; Horticulture