ABSTRACT
Previous research has demonstrated the efficacy of prescription fungicide programs, based upon Peanut Rx, to reduce the combined effects of early leaf spot (ELS), caused by Passalora arachidicola (syn. Cercospora arachidicola), and late leaf spot (LLS), caused by Nothopassalora personata (syn. Cercosporidium personatum), but the potential of Peanut Rx to predict each disease has never been formally evaluated. From 2010 to 2016, non-fungicide-treated peanut plots in Georgia and Florida were sampled to monitor the development of ELS and LLS. This resulted in 168 cases (unique combinations of Peanut Rx risk factors) with associated total leaf spot risk points ranging from 40 to 100. Defoliation ranged from 13.9 to 100% and increased significantly with increasing total risk points (conditional R² = 0.56; P < 0.001). Leaf spot onset (time in days after planting [DAP] when either leaf spot reached 1% lesion incidence), ELS onset, and LLS onset ranged from 29 to 140, 29 to 142, and 50 to 143 DAP, respectively, and decreased significantly with increasing risk points. Standardized AUDPC of ELS was significantly affected by risk points (conditional R² = 0.53; P < 0.001), but that of LLS was not. After removing redundant Peanut Rx factors, planting date, rotation, historical leaf spot prevalence, cultivar, and field history were used as fixed effects in mixed-effects regression models to evaluate their contribution to leaf spot, ELS, or LLS prediction. Results from mixed-effects regression confirmed that the selected Peanut Rx risk factors contributed to the variability of at least one measurement of development of combined or separate epidemics of ELS and LLS, but not all factors affected ELS and LLS equally. Historical leaf spot prevalence, a new potential preplant risk factor, was a consistent predictor of the dominant disease(s) observed in the field.
Results presented here demonstrate that Peanut Rx is a very effective tool for predicting leaf spot onset regardless of which leaf spot is predominant, but they also suggest that the associated risk does not reflect the same development for each disease. These data will be useful for refining the thresholds that differentiate high-, moderate-, and low-risk fields, and for reevaluating the timing of fungicide applications in reduced-input programs with respect to disease onset.
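The standardized AUDPC reported above is the area under the disease progress curve (trapezoidal rule over repeated severity ratings) divided by the duration of the assessment period. A minimal sketch in Python, using hypothetical assessment dates and severity values rather than the study's data:

```python
def audpc(days, severity):
    """Area under the disease progress curve via the trapezoidal rule.

    days: assessment times in days after planting (DAP), increasing.
    severity: disease severity (e.g., percent) at each assessment.
    """
    if len(days) != len(severity) or len(days) < 2:
        raise ValueError("need matching lists with at least two assessments")
    area = 0.0
    for i in range(len(days) - 1):
        mean_severity = (severity[i] + severity[i + 1]) / 2.0
        area += mean_severity * (days[i + 1] - days[i])
    return area


def standardized_audpc(days, severity):
    """AUDPC divided by the epidemic duration, making values
    comparable across seasons of different lengths."""
    return audpc(days, severity) / (days[-1] - days[0])


# Hypothetical ratings for one plot, from onset through late season.
days = [60, 75, 90, 105, 120]
sev = [1.0, 5.0, 15.0, 40.0, 80.0]
print(standardized_audpc(days, sev))  # 25.125
```

Standardizing by duration is what allows AUDPC values from plots with different planting and assessment windows to be pooled in a single regression.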
Subjects
Arachis, Ascomycota, Agriculture, Arachis/microbiology, Ascomycota/physiology, Florida, Industrial Fungicides, Georgia, Risk Factors, Seasons

ABSTRACT
Entomopathogenic microbes such as Spodoptera litura nucleopolyhedrovirus (SpltNPV), Metarhizium anisopliae, and Pseudomonas fluorescens are biological agents used for the control of multiple arthropod pests. The objective of this study was to assess their effects on the biological parameters of Spodoptera litura (Lepidoptera: Noctuidae) larvae and its natural reduviid predator Rhynocoris kumarii (Hemiptera: Reduviidae) under laboratory conditions. Results suggested that P. fluorescens reduced the food consumption index, relative growth rate, approximate digestibility, the efficiency of conversion of ingested food, and the efficiency of conversion of digested food of S. litura third instar larvae compared to prey infected with M. anisopliae and SpltNPV. Both SpltNPV and M. anisopliae caused similar mortality of S. litura life stages after 96 h of observation. To observe the effect of an infected prey diet on predator behavior, infected S. litura larvae were offered to the third, fourth, and fifth instar nymphs of R. kumarii, and their prey handling time, predation rate (number/day/predator), developmental period, and survival rate were recorded. When the life stages of R. kumarii were offered entomopathogen-infected S. litura larvae, their predation rate was comparable to or higher than that of the untreated control. The juvenile predator, after feeding on P. fluorescens-infected S. litura larvae, had a significantly longer developmental period (2 to 4 days) compared to those fed on larvae infected with the other microbial control agents. However, feeding on P. fluorescens alone did not affect the predator nymphal survival rate or the adult sex ratio. Although the three entomopathogens had some degree of effect on the biological parameters of R. kumarii, the outcome of this study suggests that integration of reduviids with the tested entomopathogens is a compatible and potentially effective strategy for the management of S. litura populations.
However promising, this combined strategy needs to be tested under field conditions to confirm the laboratory findings.
ABSTRACT
Empirical and mechanistic modeling indicate that pathogens transmitted via aerially dispersed inoculum follow a power law, resulting in dispersive epidemic waves. The spread parameter (b) of the power law model, which is an indicator of the distance of the epidemic wave front from an initial focus per unit time, has been found to be approximately 2 for several animal and plant diseases over a wide range of spatial scales under conditions favorable for disease spread. Although disease spread and epidemic expansion can be influenced by several factors, the stability of the parameter b over multiple epidemic years has not been determined. Additionally, the size of the initial epidemic area is expected to be strongly related to the final epidemic extent, but the stability of this relationship is also not well established. Here, empirical data of cucurbit downy mildew epidemics collected from 2008 to 2014 were analyzed using a spatio-temporal model of disease spread that incorporates logistic growth in time with a power law function for dispersal. Final epidemic extent ranged from 4.16 × 10⁸ km² in 2012 to 6.44 × 10⁸ km² in 2009. Current epidemic extent became significantly associated (P < 0.0332; 0.56 < R² < 0.99) with final epidemic area beginning near the end of April, with the association increasing monotonically to 1.0 by the end of the epidemic season in July. The position of the epidemic wave front became exponentially more distant with time, and epidemic velocity increased linearly with distance. Slopes from the temporal and spatial regression models varied about 2.5-fold across epidemic years. Estimates of b varied substantially, ranging from 1.51 to 4.16 across epidemic years. We observed a significant b × time (or distance) interaction (P < 0.05) for epidemic years where data were well described by the power law model. These results suggest that the spread parameter b may not be stable over multiple epidemic years.
However, b ≈ 2 may be considered the lower limit of the distance traveled by epidemic wave-fronts for aerially transmitted pathogens that follow a power law dispersal function.
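The two empirical signatures reported above (wave-front distance growing exponentially in time, velocity linear in distance) follow from a standard threshold argument: with a power-law kernel of exponent b and exponential growth at rate r, the front sits near x(t) = x0·e^((r/b)·t), so log-distance is linear in time with slope r/b. A minimal sketch, with hypothetical r, b, and x0 values rather than the study's fitted estimates, showing how b can be recovered from wave-front positions:

```python
import math


def fit_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den


# Hypothetical parameters (illustrative only, not the study's values).
r = 0.12       # logistic/exponential growth rate per day
b_true = 2.0   # power-law spread parameter
x0 = 10.0      # radius of the initial focus, km

# Wave front accelerates: x(t) = x0 * exp((r / b) * t), so ln(x) is
# linear in t with slope r / b, and velocity dx/dt = (r / b) * x.
times = list(range(0, 60, 5))
front = [x0 * math.exp((r / b_true) * t) for t in times]

slope = fit_slope(times, [math.log(x) for x in front])  # estimates r / b
b_hat = r / slope
print(round(b_hat, 3))  # ≈ 2.0 for noise-free data
```

With real survey data the log-distance regression would carry noise, and a year-by-year difference in the fitted slope (given r) is exactly the b × time interaction the study tests.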
ABSTRACT
Pre-planting factors have been associated with the late-season severity of Stagonospora nodorum blotch (SNB), caused by the fungal pathogen Parastagonospora nodorum, in winter wheat (Triticum aestivum). The relative importance of these factors in the risk of SNB has not been determined, and this knowledge can facilitate disease management decisions prior to planting of the wheat crop. In this study, we examined the performance of multiple regression (MR) and three machine learning algorithms, namely artificial neural networks, classification and regression trees, and random forests (RF), in predicting the pre-planting risk of SNB in wheat. Pre-planting factors tested as potential predictor variables were cultivar resistance, latitude, longitude, previous crop, seeding rate, seed treatment, tillage type, and wheat residue. Disease severity assessed at the end of the growing season was used as the response variable. The models were developed using 431 disease cases (unique combinations of predictors) collected from 2012 to 2014, and these cases were randomly divided into training, validation, and test datasets. Models were evaluated based on the regression of observed against predicted severity values of SNB, sensitivity-specificity ROC analysis, and the Kappa statistic. A strong relationship was observed between late-season severity of SNB and specific pre-planting factors, among which latitude, longitude, wheat residue, and cultivar resistance were the most important predictors. The MR model explained 33% of variability in the data, while machine learning models explained 47 to 79% of the total variability. Similarly, the MR model correctly classified 74% of the disease cases, while machine learning models correctly classified 81 to 83% of these cases. Results show that the RF algorithm, which explained 79% of the variability within the data, was the most accurate in predicting the risk of SNB, with an accuracy rate of 93%.
The RF algorithm could allow early assessment of the risk of SNB, facilitating sound disease management decisions prior to planting of wheat.
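Model agreement above was summarized with the Kappa statistic, which corrects raw classification accuracy for agreement expected by chance. A minimal sketch of Cohen's kappa in Python, using hypothetical observed and predicted risk classes rather than the study's data:

```python
from collections import Counter


def cohens_kappa(observed, predicted):
    """Cohen's kappa: agreement between two categorical label lists,
    corrected for the agreement expected by chance alone."""
    n = len(observed)
    if n == 0 or n != len(predicted):
        raise ValueError("need two equal-length, non-empty label lists")
    # Raw proportion of cases on which the two classifications agree.
    observed_agree = sum(o == p for o, p in zip(observed, predicted)) / n
    # Chance agreement from the marginal class frequencies.
    obs_counts = Counter(observed)
    pred_counts = Counter(predicted)
    labels = set(obs_counts) | set(pred_counts)
    chance_agree = sum(
        (obs_counts[label] / n) * (pred_counts[label] / n) for label in labels
    )
    return (observed_agree - chance_agree) / (1.0 - chance_agree)


# Hypothetical high/low SNB risk classes for ten disease cases.
obs = ["high", "high", "low", "low", "high", "low", "low", "high", "low", "low"]
pred = ["high", "high", "low", "low", "low", "low", "low", "high", "low", "high"]
print(round(cohens_kappa(obs, pred), 3))  # ≈ 0.583
```

A kappa near 0 means the model does no better than guessing from class frequencies, which is why it is a stricter summary than the 81 to 93% raw accuracy rates reported above.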