ABSTRACT
Rhizoctonia crown and root rot (RCRR), caused by Rhizoctonia solani, can cause severe yield and quality losses in sugar beet. The most common strategy to control the disease is the development of resistant varieties. In the breeding process, field experiments with artificial inoculation are carried out to evaluate the performance of genotypes and varieties. Phenotyping in these breeding trials requires constant monitoring and scoring by skilled experts; this work is time-consuming and subject to bias and inconsistency that depend on the experience and capacity of each rater. Optical sensors and artificial intelligence have demonstrated great potential to achieve higher accuracy than human raters and to standardize phenotyping applications. A workflow combining red-green-blue (RGB) and multispectral imagery acquired by an unmanned aerial vehicle (UAV) with machine learning techniques was applied to score plants and plots affected by RCRR. Georeferenced annotation of UAV-orthorectified images was carried out. With the annotated images, five convolutional neural networks were trained to score individual plants, using different image analysis strategies and data augmentation. A custom convolutional neural network trained from scratch and a pretrained MobileNet showed the best precision in scoring RCRR (0.73 to 0.85). Plots were scored from the per-plot average of the spectral information, and the benefit of adding the information obtained from the scores of individual plants was assessed. For this purpose, machine learning models were trained under different data management strategies, and the best-performing model was chosen. A combined pipeline of random forest and k-nearest neighbors showed the best weighted precision (0.67). This research provides a reliable workflow for detecting and scoring RCRR from aerial imagery. Because RCRR is often distributed heterogeneously within trial plots, incorporating the information from individual plants significantly improved UAV-based automated monitoring routines.
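The plant-level scoring step described above can be illustrated with a minimal sketch, assuming a Keras transfer-learning setup with a pretrained MobileNet and on-the-fly augmentation. The class count, image size, and directory layout are assumptions for illustration, not details taken from the abstract.

```python
# Hypothetical sketch: transfer learning with a pretrained MobileNet on
# annotated plant crops, with data augmentation applied during training.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5       # assumed number of RCRR score classes
IMG_SIZE = (224, 224)

# Assumed directory layout: one subfolder per score class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "plant_crops/train", image_size=IMG_SIZE, batch_size=32)

# Augmentation layers are active only in training mode.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.2),
    layers.RandomZoom(0.1),
])

base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze the pretrained backbone

model = models.Sequential([
    augment,
    layers.Rescaling(1.0 / 127.5, offset=-1),  # map pixels to [-1, 1] as MobileNet expects
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```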
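The plot-level pipeline combining random forest and k-nearest neighbors could take several forms; the abstract does not specify how the two models are chained. One plausible reading, sketched below under that assumption, trains the random forest on per-plot spectral means, the k-nearest-neighbors model on the per-plot distribution of plant-level scores, and averages their class probabilities.

```python
# Hypothetical sketch of the combined plot-level scorer; the fusion scheme
# (probability averaging) is an assumption, not the paper's documented method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data standing in for real features:
# X_spectral: per-plot means of multispectral bands, shape (n_plots, n_bands)
# X_plants:   per-plot histogram of plant-level CNN scores, shape (n_plots, n_classes)
# y:          expert plot scores, shape (n_plots,)
rng = np.random.default_rng(0)
X_spectral = rng.random((120, 5))
X_plants = rng.random((120, 5))
y = rng.integers(0, 5, 120)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_spectral, y)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_plants, y)

# Average class probabilities (assumes both models saw all classes,
# so their classes_ orderings match).
proba = (rf.predict_proba(X_spectral) + knn.predict_proba(X_plants)) / 2
plot_scores = proba.argmax(axis=1)
```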
Subject(s)
Beta vulgaris, Unmanned Aerial Devices, Humans, Rhizoctonia, Artificial Intelligence, Plant Breeding, Machine Learning, Vegetables, Sugars
ABSTRACT
Fungal infections trigger defense and signaling responses in plants, leading to various changes in plant metabolites. Changes in metabolites such as chlorophyll or flavonoids have long been detectable with time-consuming destructive analytical methods, including high-performance liquid chromatography and photometric determination. Recent plant phenotyping studies have shown that hyperspectral imaging (HSI) in the UV range can be used to link spectral changes with changes in plant metabolites. To compare established destructive analytical methods with new nondestructive hyperspectral measurements, we investigated the interaction between sugar beet leaves and the pathogens Cercospora beticola, which causes Cercospora leaf spot disease (CLS), and Uromyces betae, which causes sugar beet rust (BR). Destructive analyses showed that the two diseases affect chlorophylls, carotenoids, flavonoids, and several phenols differently. Nondestructive hyperspectral measurements in the UV range likewise revealed distinct effects of CLS and BR on plant metabolites, resulting in distinct reflectance patterns. Both diseases produced specific spectral changes that allowed differentiation between the two diseases. Machine learning algorithms enabled differentiation between symptom classes and recognition of the two sugar beet diseases. Feature importance analysis identified specific wavelengths important to the classification, highlighting the utility of the UV range. The study demonstrates that HSI in the UV range is a promising, nondestructive tool for investigating the influence of plant diseases on plant physiology and biochemistry.
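The classification and feature-importance step can be sketched minimally, assuming a random forest on per-sample UV reflectance spectra with impurity-based importances mapped back to wavelengths. The band count, wavelength range, and three-class labeling (healthy/CLS/BR) are illustrative assumptions.

```python
# Hypothetical sketch: rank UV wavelengths by random-forest feature importance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

wavelengths = np.linspace(250, 400, 100)   # assumed UV band centers, nm
rng = np.random.default_rng(0)
X = rng.random((600, wavelengths.size))    # placeholder reflectance spectra
y = rng.integers(0, 3, 600)                # placeholder labels: healthy / CLS / BR

clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

# Report the ten wavelengths with the highest impurity-based importance.
top = np.argsort(clf.feature_importances_)[::-1][:10]
for i in top:
    print(f"{wavelengths[i]:.0f} nm  importance={clf.feature_importances_[i]:.4f}")
```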
Subject(s)
Ascomycota, Beta vulgaris, Ascomycota/physiology, Beta vulgaris/microbiology, Hyperspectral Imaging, Plant Diseases/microbiology, Vegetables, Sugars
ABSTRACT
Disease incidence (DI) and metrics of disease severity (DS) are relevant parameters for decision making in plant protection and plant breeding. To develop automated, sensor-based routines, a sugar beet variety trial was inoculated with Cercospora beticola and monitored over the vegetation period with a multispectral camera system mounted on an unmanned aerial vehicle (UAV). A pipeline based on machine learning methods was established for image data analysis and extraction of disease-relevant parameters. Features based on the digital surface model, vegetation indices, shadow condition, and image resolution improved classification performance by 12 and 6% for diseased and soil regions, respectively, compared with using single multispectral channels. In a postprocessing step, area-related parameters were computed after classification. The pipeline also extracted DI and DS from the UAV data. The calculated area under the disease progress curve (AUDPC) of DS was 2,810.4 to 7,058.8 %·days for human visual scoring and 1,400.5 to 4,343.2 %·days for UAV-based scoring. Moreover, area-related parameters such as the area of complete foliage (AF), area of healthy foliage (AH), and mean area of lesion per unit of foliage differentiated varieties more sharply than visual scoring. These advantages provide the option to replace laborious visual disease assessments in the field with a more precise, nondestructive assessment via multispectral data acquired by UAV flights.
Copyright © 2023 The Author(s). This is an open access article distributed under the CC BY-NC-ND 4.0 International license.
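The AUDPC values reported above follow the standard trapezoidal-rule definition over assessment dates, which yields units of %·days when severity is in percent and time in days. A minimal sketch, with made-up example values:

```python
# Standard AUDPC calculation: trapezoidal rule over the disease progress curve.
import numpy as np

def audpc(severity, days):
    """AUDPC = sum_i (y_i + y_{i+1})/2 * (t_{i+1} - t_i)."""
    y = np.asarray(severity, dtype=float)
    t = np.asarray(days, dtype=float)
    return float(np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(t)))

# Made-up example: severity in %, time in days after inoculation.
days = [0, 14, 28, 42, 56]
severity = [0.0, 2.5, 10.0, 35.0, 60.0]
print(f"AUDPC = {audpc(severity, days):.1f} %*days")  # -> AUDPC = 1085.0 %*days
```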