1.
Plants (Basel); 13(14), 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39065507

ABSTRACT

Accurate peach detection is essential for automated agronomic management, such as mechanical peach harvesting. However, ubiquitous occlusion makes identifying peaches against complex backgrounds extremely challenging. In addition, fine-grained peach features are difficult to capture from a single RGB image, which is susceptible to lighting variation and noise in scenes with dense clusters of small targets and extreme illumination. To address these problems, this study proposes CRLNet, a multimodal detector based on RGB and depth images. First, YOLOv9 was extended with a backbone network that extracts RGB and depth features in parallel. Second, to address information fusion bias, the Rough-Fine Hybrid Attention Fusion Module (RFAM) was designed to combine the complementary information of the two modalities while suppressing hollow noise at the peach edges. Finally, a Transformer-based Local-Global Joint Enhancement Module (LGEM) was developed to jointly enhance the local and global features of peaches across modalities, increasing the proportion of information attributable to the target peaches and removing the interference of redundant background information. CRLNet was trained on the Peach dataset and evaluated against other state-of-the-art methods, achieving an mAP50 of 97.1%. CRLNet also achieved an mAP50 of 92.4% in generalization experiments, validating its strong generalization capability. These results provide valuable insights for multimodal detection of peaches and other outdoor fruits.
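
The paper itself provides no code; the following is a minimal PyTorch sketch of the general idea of a two-branch RGB-depth backbone with an attention-weighted fusion step. The module names, channel sizes, and gating design below are illustrative assumptions and do not reproduce the authors' RFAM or YOLOv9-based backbone.

```python
# Minimal sketch (not the authors' code): a two-branch RGB + depth backbone
# with a simple channel-attention fusion, illustrating parallel modality-specific
# feature extraction followed by weighted fusion.
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """3x3 conv -> BN -> SiLU, the basic unit of each branch (illustrative)."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride, 1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.SiLU(inplace=True),
        )
    def forward(self, x):
        return self.block(x)

class AttentionFusion(nn.Module):
    """Fuse RGB and depth features with learned per-channel weights.
    A stand-in for the paper's RFAM, which is not reimplemented here."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, 1),
            nn.Sigmoid(),
        )
    def forward(self, rgb_feat, depth_feat):
        w = self.gate(torch.cat([rgb_feat, depth_feat], dim=1))
        return w * rgb_feat + (1 - w) * depth_feat

class DualBranchBackbone(nn.Module):
    """Extract RGB and depth features in parallel, then fuse them."""
    def __init__(self, channels=64):
        super().__init__()
        self.rgb_branch = nn.Sequential(ConvBlock(3, channels, 2), ConvBlock(channels, channels))
        self.depth_branch = nn.Sequential(ConvBlock(1, channels, 2), ConvBlock(channels, channels))
        self.fusion = AttentionFusion(channels)
    def forward(self, rgb, depth):
        return self.fusion(self.rgb_branch(rgb), self.depth_branch(depth))

if __name__ == "__main__":
    model = DualBranchBackbone()
    rgb = torch.randn(1, 3, 256, 256)    # RGB image
    depth = torch.randn(1, 1, 256, 256)  # aligned depth map
    print(model(rgb, depth).shape)       # torch.Size([1, 64, 128, 128])
```

In a real detector the fused features would feed the detection neck and head; the sketch only shows the fusion stage described in the abstract.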

2.
Plants (Basel); 13(13), 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38999667

ABSTRACT

Maize, one of the most important crops in the world, faces severe challenges from various diseases and pests. Timely and accurate identification of maize leaf diseases and pests is therefore critical for safeguarding agricultural production. This identification currently faces two key challenges: (1) in practice, complex backgrounds interfere with identification accuracy, and (2) the subtle features of diseases and pests are difficult to extract accurately. To address these challenges, this study proposes a maize leaf disease and pest identification model called LFMNet. First, the localized multi-scale inverted residual convolutional block (LMSB) is proposed to perform preliminary down-sampling of the image while preserving the feature information needed for subsequent extraction of fine disease and pest features. Then, the feature localization bottleneck (FLB) is proposed to improve the model's ability to focus on and locate disease and pest characteristics and to reduce interference from complex backgrounds. Finally, the multi-hop local-feature fusion architecture (MLFFA) is proposed to address the problem of subtle-feature extraction by enhancing the extraction and fusion of global and local disease and pest features. After training and testing on a dataset of 19,451 images of maize leaf diseases and pests, LFMNet achieved an average identification accuracy of 95.68%, a precision of 95.91%, a recall of 95.78%, and an F1 score of 95.83%. Compared to existing models, it exhibits significant advantages, offering robust technical support for the precise identification of maize diseases and pests.
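
The abstract does not specify the internal structure of the LMSB. As a rough analogue only, the sketch below shows a standard MobileNet-style inverted residual block extended with parallel multi-scale depthwise convolutions for down-sampling; the kernel sizes, expansion ratio, and fusion by addition are assumptions, not the authors' design.

```python
# Minimal sketch (not the authors' LMSB): an inverted residual block with
# parallel multi-scale depthwise convolutions, illustrating the general pattern
# of multi-scale down-sampling while preserving feature information.
import torch
import torch.nn as nn

class MultiScaleInvertedResidual(nn.Module):
    def __init__(self, in_ch, out_ch, expand=4, stride=2):
        super().__init__()
        hidden = in_ch * expand
        self.expand = nn.Sequential(              # 1x1 channel expansion
            nn.Conv2d(in_ch, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.SiLU(inplace=True),
        )
        # Parallel depthwise convolutions at two kernel sizes (multi-scale).
        self.dw3 = nn.Conv2d(hidden, hidden, 3, stride, 1, groups=hidden, bias=False)
        self.dw5 = nn.Conv2d(hidden, hidden, 5, stride, 2, groups=hidden, bias=False)
        self.project = nn.Sequential(             # 1x1 projection back down
            nn.BatchNorm2d(hidden),
            nn.SiLU(inplace=True),
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        # Residual path only when spatial size and channel count are unchanged.
        self.use_residual = stride == 1 and in_ch == out_ch

    def forward(self, x):
        h = self.expand(x)
        h = self.dw3(h) + self.dw5(h)             # fuse the two scales
        out = self.project(h)
        return out + x if self.use_residual else out

if __name__ == "__main__":
    block = MultiScaleInvertedResidual(3, 32)
    print(block(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 32, 112, 112])
```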

3.
Plants (Basel); 13(11), 2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38891389

ABSTRACT

Pepper is a high-value agricultural crop that faces diverse disease challenges such as blight and anthracnose. These diseases not only reduce pepper yields but, in severe cases, can also cause significant economic losses and threaten food security, so their timely and accurate identification is crucial. Image recognition technology plays a key role here: by identifying pepper diseases automatically and efficiently, it helps agricultural workers adopt effective control strategies, mitigates the impact of disease, and supports agricultural production efficiency and sustainable agricultural development. To address edge blurring and minute-feature extraction in pepper disease image recognition, as well as the difficulty of determining the optimal learning rate when training traditional pepper disease identification networks, a new pepper disease recognition model, TPSAO-AMWNet, is proposed. First, an Adaptive Residual Pyramid Convolution (ARPC) structure combined with a Squeeze-and-Excitation (SE) module is proposed to mitigate edge blurring through adaptivity and channel attention. Second, to address micro-feature extraction, Minor Triplet Disease Focus Attention (MTDFA) is proposed to enhance the capture of local details of pepper leaf disease features while maintaining attention to global features and reducing interference from irrelevant regions. Third, a mixed loss function combining Weighted Focal Loss and L2 regularization (WfrLoss) is introduced to refine the learning strategy during dataset processing, improving the model's performance and generalization while preventing overfitting. Finally, to tackle the challenge of determining the optimal learning rate, the tent particle snow ablation optimizer (TPSAO) is developed to identify the most effective learning rate. Trained on the authors' custom datasets and evaluated against other existing methods, TPSAO-AMWNet attains an average accuracy of 93.52% and an F1 score of 93.15%, demonstrating robust effectiveness and practicality in classifying pepper diseases. These results also offer valuable insights for disease detection in other crops.
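
The abstract describes WfrLoss as a combination of Weighted Focal Loss and L2 regularization but gives no formula. Below is a minimal sketch, assuming the standard class-weighted focal loss plus an explicit L2 penalty over model parameters; the weighting scheme and hyperparameter values are assumptions, not the paper's exact WfrLoss.

```python
# Minimal sketch (an assumption, not the paper's exact WfrLoss): class-weighted
# focal loss combined with an explicit L2 penalty on the model parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedFocalWithL2(nn.Module):
    def __init__(self, class_weights=None, gamma=2.0, l2_lambda=1e-4):
        super().__init__()
        self.class_weights = class_weights  # tensor of shape [num_classes], or None
        self.gamma = gamma                  # focusing parameter of the focal term
        self.l2_lambda = l2_lambda          # strength of the L2 regularization

    def forward(self, logits, targets, model):
        # Focal loss: down-weight well-classified examples by (1 - p_t)^gamma.
        ce = F.cross_entropy(logits, targets, weight=self.class_weights, reduction="none")
        p_t = torch.exp(-ce)
        focal = ((1.0 - p_t) ** self.gamma * ce).mean()
        # Explicit L2 penalty over all trainable parameters.
        l2 = sum(p.pow(2).sum() for p in model.parameters() if p.requires_grad)
        return focal + self.l2_lambda * l2

if __name__ == "__main__":
    model = nn.Linear(16, 5)                          # toy classifier
    criterion = WeightedFocalWithL2(gamma=2.0, l2_lambda=1e-4)
    logits = model(torch.randn(8, 16))
    targets = torch.randint(0, 5, (8,))
    print(criterion(logits, targets, model).item())
```

In practice the L2 term is often delegated to the optimizer's weight decay; it is written out explicitly here only to mirror the mixed loss the abstract names.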
