Results 1 - 3 of 3
1.
Sensors (Basel) ; 23(6)2023 Mar 08.
Article in English | MEDLINE | ID: mdl-36991638

ABSTRACT

Recent studies indicate that food demand will increase by 35-56% over the period 2010-2050 due to population growth, economic development, and urbanization. Greenhouse systems allow for the sustainable intensification of food production, with demonstrated high crop production per cultivation area. Breakthroughs in resource-efficient fresh food production that merge horticultural and AI expertise are driven by the international competition "Autonomous Greenhouse Challenge". This paper describes and analyzes the results of the third edition of this competition. The competition's goal was to realize the highest net profit in fully autonomous lettuce production. Two cultivation cycles were conducted in six high-tech greenhouse compartments, with operational greenhouse decision-making performed remotely and individually by the algorithms of the international participating teams. Algorithms were developed from time-series sensor data of the greenhouse climate and from crop images. High crop yield and quality, short growing cycles, and low use of resources such as energy for heating, electricity for artificial light, and CO2 were decisive in realizing the competition's goal. The results highlight the importance of plant-spacing and harvest-timing decisions in promoting high crop growth rates while optimizing greenhouse occupation and resource use. In this paper, images taken with depth cameras (RealSense) in each greenhouse were used by computer vision algorithms (DeepLabv3+ implemented in detectron2 v0.6) to decide optimum plant spacing and the moment of harvest. The resulting plant height and coverage could be estimated accurately, with an R2 of 0.976 and an mIoU of 98.2, respectively. These two traits were used to develop a light-loss indicator and a harvest indicator to support remote decision-making. The light-loss indicator could be used as a decision tool for timely spacing. Several traits were combined for the harvest indicator, ultimately resulting in a fresh-weight estimation with a mean absolute error of 22 g. The non-invasively estimated indicators proposed in this article are promising traits to be used towards full automation of a dynamic commercial lettuce-growing environment. Computer vision algorithms act as a catalyst in remote and non-invasive sensing of crop parameters, which is decisive for automated, objective, standardized, and data-driven decision-making. However, spectral indices describing lettuce growth and datasets larger than those currently accessible are crucial to address the gaps between academic and industrial production systems encountered in this work.
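Illustrative note (not part of the abstract): the paper's segmentation pipeline and indicator definitions are not reproduced here, but a minimal sketch of how coverage, height, and a spacing trigger could be derived from a top-down segmentation mask and depth map might look as follows. All function names, the camera geometry, and the 0.85 threshold are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: estimate canopy coverage, plant height, and a hypothetical
# "light loss" spacing trigger from a binary plant mask and a depth map of the
# kind a RealSense-style camera produces. In the paper the mask would come
# from a DeepLabv3+ model; here both inputs are taken as given.
import numpy as np

def plant_height(depth_map: np.ndarray, mask: np.ndarray, bench_depth: float) -> float:
    """Height (m) from bench to the highest canopy point.
    Assumes a camera looking straight down and a known bench-to-camera distance."""
    canopy_depths = depth_map[mask > 0]
    return float(bench_depth - canopy_depths.min()) if canopy_depths.size else 0.0

def coverage(mask: np.ndarray, cell_mask: np.ndarray) -> float:
    """Fraction of the plant's allotted growing cell covered by canopy."""
    return float(mask[cell_mask > 0].mean())

def spacing_trigger(mask: np.ndarray, cell_mask: np.ndarray, threshold: float = 0.85) -> bool:
    """Hypothetical light-loss proxy: flag for re-spacing once coverage is high
    enough that neighbouring plants would start shading each other."""
    return coverage(mask, cell_mask) >= threshold

# Toy example: 4x4 depth map (m) with a small canopy 15 cm above the bench.
depth = np.full((4, 4), 1.20); depth[1:3, 1:3] = 1.05
mask = np.zeros((4, 4)); mask[1:3, 1:3] = 1
cell = np.ones((4, 4))
print(plant_height(depth, mask, bench_depth=1.20), coverage(mask, cell))
```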


Subjects
Imaging, Three-Dimensional , Lactuca , Crop Production , Climate , Computers
2.
Front Plant Sci ; 14: 1233349, 2023.
Article in English | MEDLINE | ID: mdl-37662173

ABSTRACT

A phenotyping pipeline utilising DeepLab was developed for precisely estimating the height, volume, coverage, and vegetation indices of European and Japanese potato varieties. Using this pipeline, the effect of varying UAV flight height on the precise estimation of potato crop growth properties was evaluated. A UAV fitted with a multispectral camera was flown at heights of 15 m and 30 m over an experimental field where various potato varieties were grown. Plant height, volume, and NDVI were evaluated and compared with the manually obtained parameters. Strong linear correlations, with R2 of 0.803 and 0.745, were obtained between the UAV-derived and manually measured plant heights when the UAV was flown at 15 m and 30 m, respectively. Similarly, high linear correlations, with R2 of 0.839 and 0.754, were obtained between the UAV-estimated and manually estimated volumes at 15 m and 30 m, respectively. For the vegetation indices, there were no observable differences in the NDVI values obtained at the two flight heights, and high linear correlations, with R2 of 0.930 and 0.931, were obtained between UAV-estimated and manually measured NDVI at 15 m and 30 m, respectively. The UAV flown at the lower height had a finer ground sampling distance and thus higher image resolution, leading to more precise estimation of both crop height and volume. For the vegetation indices, flying the UAV at the higher altitude had no effect on the precision of the NDVI estimates.
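Illustrative note (not part of the abstract): a minimal sketch of the two quantities the abstract hinges on, NDVI from red and near-infrared reflectance bands and ground sampling distance (GSD) as a function of flight height, showing why a 15 m flight resolves finer detail than a 30 m flight. The pixel pitch and focal length below are placeholder camera values, not the study's.

```python
# Minimal sketch: NDVI and ground sampling distance for two flight heights.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), with a small epsilon to avoid /0."""
    return (nir - red) / (nir + red + 1e-9)

def ground_sampling_distance(flight_height_m: float,
                             pixel_pitch_mm: float = 0.00375,
                             focal_length_mm: float = 5.4) -> float:
    """GSD (m/pixel) = pixel pitch * flight height / focal length.
    Pixel pitch and focal length are placeholder camera parameters."""
    return pixel_pitch_mm * flight_height_m / focal_length_mm

for h in (15.0, 30.0):
    print(f"{h:>4.0f} m flight -> GSD ~ {ground_sampling_distance(h) * 100:.2f} cm/pixel")
```

Doubling the flight height doubles the GSD, which is consistent with the coarser height and volume estimates reported at 30 m.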

3.
Plant Methods ; 19(1): 49, 2023 May 20.
Article in English | MEDLINE | ID: mdl-37210517

ABSTRACT

BACKGROUND: A well-known method for evaluating plant resistance to insects is measuring insect reproduction or oviposition. Whiteflies are vectors of economically important viral diseases and are therefore widely studied. In a common experiment, whiteflies are placed on plants using clip-on cages, where they can lay hundreds of eggs on susceptible plants within a few days. When quantifying whitefly eggs, most researchers count them by eye under a stereomicroscope. Compared to other insect eggs, whitefly eggs are numerous and very tiny, usually 0.2 mm in length and 0.08 mm in width; this process therefore takes considerable time and effort, with or without prior expert knowledge. Plant insect-resistance experiments require multiple replicates from different plant accessions, so an automated and rapid method for quantifying insect eggs can save time and human resources. RESULTS: In this work, a novel automated tool for fast quantification of whitefly eggs is presented to accelerate the determination of plant insect resistance and susceptibility. Leaf images with whitefly eggs were collected from a commercial microscope and a custom-built imaging system. A deep learning-based object detection model was trained on the collected images and incorporated into an automated whitefly egg quantification algorithm, deployed in a web-based application called Eggsplorer. Upon evaluation on a test dataset, the algorithm achieved a counting accuracy as high as 0.94, an R2 of 0.99, and a counting error of ±3 eggs relative to the number of eggs counted by eye. The automatically collected counts were used to determine the resistance and susceptibility of several plant accessions and yielded results comparable to those obtained using the manually collected counts. CONCLUSION: This is the first work to present a comprehensive, step-by-step method for fast determination of plant insect resistance and susceptibility with the assistance of an automated quantification tool.
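Illustrative note (not part of the abstract): a minimal sketch of how automated egg counts might be compared against manual counts using metrics of the kind quoted above (counting accuracy, R2, counting error). The counting-accuracy definition and the example counts are assumptions for illustration, not the Eggsplorer implementation.

```python
# Minimal sketch: compare automated whitefly-egg counts against manual counts.
import numpy as np

def count_metrics(predicted: np.ndarray, manual: np.ndarray) -> dict:
    errors = predicted - manual
    ss_res = float(np.sum((manual - predicted) ** 2))
    ss_tot = float(np.sum((manual - manual.mean()) ** 2))
    return {
        # Assumed definition: 1 minus the relative counting error, averaged over leaves.
        "counting_accuracy": float(np.mean(1 - np.abs(errors) / np.maximum(manual, 1))),
        # Coefficient of determination of predicted vs. manual counts.
        "r2": 1 - ss_res / ss_tot,
        # Mean absolute counting error in eggs.
        "mean_abs_error_eggs": float(np.mean(np.abs(errors))),
    }

# Made-up example counts for five leaf images.
predicted = np.array([102, 57, 233, 18, 149])
manual = np.array([100, 60, 230, 20, 151])
print(count_metrics(predicted, manual))
```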
