Results 1 - 3 of 3
1.
Planta; 259(5): 116, 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38592549

ABSTRACT

MAIN CONCLUSION: Differentially expressed microRNAs were found to be associated with the development of chasmogamous and cleistogamous flowers in Viola prionantha, revealing potential roles of microRNAs in the developmental evolution of dimorphic flowers.

In Viola prionantha, chasmogamous (CH) flowers are induced by short daylight, while cleistogamous (CL) flowers are triggered by long daylight. How environmental factors and microRNAs (miRNAs) affect dimorphic flower formation remains unknown. In this study, small RNA sequencing was performed on CH and CL floral buds at different developmental stages in V. prionantha, differentially expressed miRNAs (DEmiRNAs) were identified, and their target genes were predicted. In CL flowers, Viola prionantha miR393 (vpr-miR393a/b) and vpr-miRN3366 were highly expressed, while in CH flowers, vpr-miRN2005, vpr-miR172e-2, vpr-miR166m-3, vpr-miR396f-2, and vpr-miR482d-2 were highly expressed. In the auxin-activated signaling pathway, vpr-miR393a/b and vpr-miRN2005 could target Vpr-TIR1/AFB and Vpr-ARF2, respectively, and other DEmiRNAs could target genes involved in the regulation of transcription, e.g., Vpr-AP2-7. Moreover, Vpr-UFO and Vpr-YAB5, the main regulators of petal and stamen development, were co-expressed with Vpr-TIR1/AFB and Vpr-ARF2 and showed lower expression in CL flowers than in CH flowers. Some V. prionantha genes related to stress/defense responses were co-expressed with Vpr-TIR1/AFB, Vpr-ARF2, and Vpr-AP2-7 and were highly expressed in CL flowers. Therefore, in V. prionantha, CH-CL flower development may be regulated by the identified DEmiRNAs and their target genes, providing the first insight into the formation of dimorphic flowers in Viola.


Subjects
MicroRNAs; Viola; Flowers/genetics; MicroRNAs/genetics; Reproduction; Sequence Analysis, RNA
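The study's core step is calling DEmiRNAs from small-RNA read counts of CH and CL buds. Below is a minimal, generic sketch of such a comparison; the file name, the CH*/CL* replicate column naming, the thresholds, and the use of a simple t-test are illustrative assumptions and do not reproduce the study's actual pipeline (which typically relies on dedicated tools such as DESeq2 or edgeR).

```python
# Generic DEmiRNA-calling sketch: compare CL vs. CH small-RNA libraries.
import numpy as np
import pandas as pd
from scipy import stats

# mirna_counts.csv (hypothetical): rows = miRNAs, columns = CH/CL replicate libraries
counts = pd.read_csv("mirna_counts.csv", index_col=0)
ch_cols = [c for c in counts.columns if c.startswith("CH")]
cl_cols = [c for c in counts.columns if c.startswith("CL")]

# Normalize to counts per million so libraries of different depth are comparable
cpm = counts / counts.sum(axis=0) * 1e6

# log2 fold change (CL vs. CH) with a pseudocount to avoid division by zero
log2fc = np.log2((cpm[cl_cols].mean(axis=1) + 1) / (cpm[ch_cols].mean(axis=1) + 1))

# Simple per-miRNA t-test on log-transformed CPM values (illustrative stand-in
# for the dedicated differential-expression tools used in real small-RNA studies)
pvals = stats.ttest_ind(np.log2(cpm[cl_cols] + 1),
                        np.log2(cpm[ch_cols] + 1), axis=1).pvalue

result = pd.DataFrame({"log2FC_CL_vs_CH": log2fc, "p_value": pvals})
de_mirnas = result[(result.log2FC_CL_vs_CH.abs() >= 1) & (result.p_value < 0.05)]
print(de_mirnas.sort_values("p_value").head())
```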
2.
Plants (Basel); 13(13), 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38999667

ABSTRACT

Maize, as one of the most important crops in the world, faces severe challenges from various diseases and pests. Timely and accurate identification of maize leaf diseases and pests is of great significance for ensuring agricultural production. Currently, this identification faces two key challenges: (1) in real-world settings, complex backgrounds interfere with recognition, and (2) the subtle features of diseases and pests are difficult to extract accurately. To address these challenges, this study proposes a maize leaf disease and pest identification model called LFMNet. First, the localized multi-scale inverted residual convolutional block (LMSB) is proposed to perform preliminary down-sampling on the image while preserving the feature information needed for the subsequent extraction of fine disease and pest features. Then, the feature localization bottleneck (FLB) is proposed to improve the model's ability to focus on and localize disease and pest characteristics and to reduce interference from complex backgrounds. Subsequently, the multi-hop local-feature fusion architecture (MLFFA) is proposed, which addresses the problem of extracting subtle features by enhancing the extraction and fusion of global and local disease and pest features. After training and testing on a dataset containing 19,451 images of maize leaf diseases and pests, the LFMNet model demonstrated excellent performance, with an average identification accuracy of 95.68%, a precision of 95.91%, a recall of 95.78%, and an F1 score of 95.83%. Compared to existing models, it exhibits significant advantages, offering robust technical support for the precise identification of maize diseases and pests.
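For illustration, the PyTorch sketch below shows a generic multi-scale inverted-residual block in the spirit of the LMSB described above. The abstract does not specify LFMNet's exact layer layout, so the kernel sizes, expansion ratio, and class name here are assumptions, not the published architecture.

```python
# Multi-scale inverted-residual block sketch (hypothetical layout).
import torch
import torch.nn as nn

class MultiScaleInvertedResidual(nn.Module):
    def __init__(self, channels: int, expansion: int = 4):
        super().__init__()
        hidden = channels * expansion
        # 1x1 expansion, as in standard inverted-residual designs
        self.expand = nn.Sequential(
            nn.Conv2d(channels, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True))
        # parallel depthwise convolutions at two scales (3x3 and 5x5)
        self.dw3 = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden, bias=False)
        self.dw5 = nn.Conv2d(hidden, hidden, 5, padding=2, groups=hidden, bias=False)
        self.bn = nn.BatchNorm2d(hidden)
        self.act = nn.ReLU6(inplace=True)
        # 1x1 projection back to the input width
        self.project = nn.Sequential(
            nn.Conv2d(hidden, channels, 1, bias=False), nn.BatchNorm2d(channels))

    def forward(self, x):
        y = self.expand(x)
        y = self.act(self.bn(self.dw3(y) + self.dw5(y)))  # fuse the two scales
        return x + self.project(y)                         # residual connection

if __name__ == "__main__":
    block = MultiScaleInvertedResidual(32)
    print(block(torch.randn(1, 32, 64, 64)).shape)  # torch.Size([1, 32, 64, 64])
```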

3.
Plants (Basel); 13(14), 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39065507

ABSTRACT

Accurate peach detection is essential for automated agronomic management, such as mechanical peach harvesting. However, ubiquitous occlusion makes identifying peaches against complex backgrounds extremely challenging. In addition, fine-grained peach features are difficult to capture from a single RGB image, which can suffer from lighting variation and noise in scenes with dense clusters of small targets and extreme illumination. To solve these problems, this study proposes a multimodal detector, called CRLNet, based on RGB and depth images. First, YOLOv9 was extended to design a backbone network that extracts RGB and depth features from an image in parallel. Second, to address information-fusion bias, the Rough-Fine Hybrid Attention Fusion Module (RFAM) was designed to combine the advantageous information of the two modalities while suppressing hollow noise at the peach edges. Finally, a Transformer-based Local-Global Joint Enhancement Module (LGEM) was developed to jointly enhance the local and global features of peaches using information from both modalities, increasing the proportion of information relevant to the target peaches and removing interference from redundant background information. CRLNet was trained on the Peach dataset and evaluated against other state-of-the-art methods; the model achieved an mAP50 of 97.1%. In addition, CRLNet achieved an mAP50 of 92.4% in generalization experiments, validating its strong generalization capability. These results provide valuable insights for multimodal detection of peaches and other outdoor fruit.
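As a rough illustration of the attention-weighted RGB-depth fusion that a module like RFAM performs, the PyTorch sketch below blends the two modalities with a learned per-channel gate. The module name, channel counts, and gating scheme are assumptions for illustration, not the paper's actual design.

```python
# Channel-gated fusion of RGB and depth feature maps (hypothetical design).
import torch
import torch.nn as nn

class RGBDepthFusion(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # per-channel gate deciding how much to trust each modality
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid())
        # light refinement of the fused features
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))

    def forward(self, rgb_feat, depth_feat):
        w = self.gate(torch.cat([rgb_feat, depth_feat], dim=1))  # weights in (0, 1)
        fused = w * rgb_feat + (1 - w) * depth_feat              # weighted blend
        return self.refine(fused)

if __name__ == "__main__":
    fuse = RGBDepthFusion(64)
    out = fuse(torch.randn(1, 64, 80, 80), torch.randn(1, 64, 80, 80))
    print(out.shape)  # torch.Size([1, 64, 80, 80])
```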
