Results 1 - 20 of 96
1.
Phytopathology ; 114(3): 590-602, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38079394

ABSTRACT

Growers often use alternations or mixtures of fungicides to slow the development of fungicide resistance. However, within a landscape, some growers will implement such resistance management methods whereas others will not, and some may even apply solo components of the resistance management program. We investigated whether growers using solo components of resistance management programs affect the durability of disease control in the fields of those who implement fungicide resistance management. We developed a spatially implicit semidiscrete epidemiological model of the development of fungicide resistance. The model simulates the development of epidemics of spot-form net blotch disease, caused by the pathogen Pyrenophora teres f. maculata. The landscape comprises three types of fields, grouped according to their treatment program, with spore dispersal between fields early in the cropping season. In one field type a fungicide resistance management method is implemented, whereas in the other two it is not, with one of these field types using a component of the fungicide resistance management program. The output of the model suggests that the use of component fungicides does affect the durability of disease control for growers using resistance management programs. The magnitude of the effect depends on the characteristics of the pathosystem, the degree of inoculum mixing between fields, and the resistance management program being used. Additionally, although increasing the amount of the solo component in the landscape generally decreases the lifespan within which the resistance management program provides effective control, situations exist where the lifespan may be minimized at intermediate levels of the solo component fungicide. Copyright © 2024 The Author(s). This is an open access article distributed under the CC BY 4.0 International license.


Subjects
Ascomycota; Fungicides, Industrial; Hordeum; Fungicides, Industrial/pharmacology; Western Australia; Plant Diseases/prevention & control
2.
PLoS Biol ; 18(10): e3000863, 2020 10.
Article in English | MEDLINE | ID: mdl-33044954

ABSTRACT

Emerging infectious diseases (EIDs) of plants continue to devastate ecosystems and livelihoods worldwide. Effective management requires surveillance to detect epidemics at an early stage. However, despite the increasing use of risk-based surveillance programs in plant health, it remains unclear how best to target surveillance resources to achieve this. We combine a spatially explicit model of pathogen entry and spread with a statistical model of detection, and use a stochastic optimisation routine to identify which arrangement of surveillance sites maximises the probability of detecting an invading epidemic. Our approach reveals that it is not always optimal to target the highest-risk sites, and that the optimal strategy differs depending not only on the patterns of pathogen entry and spread but also on the choice of detection method. That is, we find that spatial correlation in risk can make it suboptimal to focus solely on the highest-risk sites, meaning that it is best to avoid 'putting all your eggs in one basket'. However, this depends on an interplay with other factors, such as the sensitivity of the available detection methods. Using the economically important arboreal disease huanglongbing (HLB), we demonstrate how our approach leads to a significant performance gain and cost saving in comparison with conventional methods of targeted surveillance.
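The interplay between risk concentration and detector sensitivity can be illustrated with a deliberately simplified two-cluster sketch. This is a toy, not the paper's spatially explicit model: the epidemic is assumed to enter exactly one of two clusters, and every surveyed site in the invaded cluster detects it independently with probability `sens`; all numbers below are hypothetical.

```python
# Toy two-cluster illustration (not the paper's model): an epidemic enters
# exactly one cluster; each surveyed site in the invaded cluster detects it
# independently with probability `sens`. Entry probabilities are assumptions.
def detection_probability(alloc, entry, sens):
    # alloc[i] sites placed in cluster i; entry[i] probability of entry there
    return sum(p * (1.0 - (1.0 - sens) ** n) for p, n in zip(entry, alloc))

def best_allocation(entry, sens, budget):
    # enumerate every split of the site budget between the two clusters
    allocs = [(n, budget - n) for n in range(budget + 1)]
    return max(allocs, key=lambda a: detection_probability(a, entry, sens))

entry = [0.7, 0.3]  # cluster entry probabilities (high-risk vs low-risk)
print(best_allocation(entry, sens=0.5, budget=2))  # (2, 0): pile onto high risk
print(best_allocation(entry, sens=0.9, budget=2))  # (1, 1): spread the sites
```

With an insensitive detector, both sites go to the high-risk cluster to compensate for missed cases; with a sensitive one, covering both entry routes wins, echoing the abstract's point that the optimal strategy depends on the detection method as well as the risk map.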


Subjects
Models, Biological; Plant Diseases/microbiology; Cluster Analysis; Computer Simulation; Epidemics; Probability; Risk Factors; Sample Size
3.
J Theor Biol ; 560: 111385, 2023 03 07.
Article in English | MEDLINE | ID: mdl-36565952

ABSTRACT

Early detection of invaders requires finding small numbers of individuals across large landscapes. It has been argued that the only feasible way to achieve the sampling effort needed for early detection of an invader is to involve volunteer groups (citizen scientists, passive surveyors, etc.). A key concern is that volunteers may have considerable false-positive and false-negative rates. The question then becomes whether verification of a report from a volunteer is worth the effort. This question is the topic of this paper. Since we are interested in early detection, we calculate the Z% upper limit of the one-sided confidence interval of the incidence (fraction infected) and use the term maximum plausible incidence for this. We compare the maximum plausible incidence when the expert samples on their own, q̃_E, and the maximum plausible incidence when the expert only verifies cases reported by the volunteer surveyor to be infected, q̃_V. The two are related as q̃_V = (θ_fp / (1 - θ_fn)) · q̃_E, where θ_fp and θ_fn are the false-positive and false-negative rates of the volunteer surveyor, respectively. We also show that the optimal monitoring programme consists of verifying only the cases reported by the volunteer surveyor if T_X / T_N < θ_fp / (1 - θ_fn), where T_N is the time needed for a sample taken by the expert and T_X is the time needed for an expert to verify a case reported by a volunteer surveyor. Our results can be used to calculate the maximum plausible incidence of a plant disease based on reports of passive surveyors that have been verified by experts and on data from experts sampling on their own. The results can also be used in the development phase of a surveillance project to assess whether including passive surveyor reports is useful in the early detection of exotic invaders.
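The two relations in the abstract can be checked numerically. A minimal sketch follows; the symbols mirror the abstract, but the numeric rates and times are hypothetical, not from the study.

```python
# Illustration of the abstract's verification rule (symbols follow the
# abstract; the numeric values below are hypothetical, not from the study).
theta_fp = 0.10   # volunteer false-positive rate
theta_fn = 0.30   # volunteer false-negative rate

# Maximum plausible incidence when the expert verifies volunteer reports,
# relative to the expert sampling alone: q_V = (theta_fp / (1 - theta_fn)) * q_E
def q_v_from_q_e(q_e, theta_fp, theta_fn):
    return theta_fp / (1.0 - theta_fn) * q_e

# Decision rule: verifying only volunteer reports is optimal when
# T_X / T_N < theta_fp / (1 - theta_fn), where T_N is the expert's time per
# own sample and T_X the time to verify one volunteer report.
def verify_only_is_optimal(t_x, t_n, theta_fp, theta_fn):
    return t_x / t_n < theta_fp / (1.0 - theta_fn)

print(q_v_from_q_e(0.02, theta_fp, theta_fn))                # ≈ 0.00286
print(verify_only_is_optimal(0.1, 1.0, theta_fp, theta_fn))  # True
```

With these (made-up) rates, a verified volunteer report pins the plausible incidence roughly seven times lower than expert-only sampling, and verification-only monitoring is worthwhile whenever verifying a report takes less than about 14% of the time of an independent expert sample.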


Subjects
Volunteers; Humans
4.
Plant Mol Biol ; 109(3): 325-349, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34313932

ABSTRACT

KEY MESSAGE: We summarise modelling studies of the most economically important cassava diseases and arthropod pests, highlighting research gaps where modelling can contribute to better management in the areas of surveillance, control, host-pest dynamics, the effects of climate change, and future challenges in modelling. For over 30 years, experimental and theoretical studies have sought to better understand the epidemiology of the cassava diseases and arthropods that affect production and cause considerable yield loss, in order to detect and control them more effectively. In this review, we consider the contribution of modelling studies to that understanding. We summarise studies of the most economically important cassava pests, including cassava mosaic disease, cassava brown streak disease, the cassava mealybug, and the cassava green mite. We focus on conceptual models of system dynamics rather than statistical methods. Through our analysis we identified areas where modelling has contributed and areas where it can improve and contribute further. First, we identify research challenges in the modelling developed for the surveillance, detection, and control of cassava pests, and propose approaches to overcome them. We then look at the contributions that modelling has made to understanding the interactions and dynamics of cassava and its pests, highlighting success stories and areas needing improvement. Third, we consider the potential of novel modelling applications to provide insights into the impacts and uncertainties of climate change. Finally, we identify research gaps, challenges, and opportunities where modelling can develop and contribute to the management of cassava pests, highlighting recent advances in understanding the molecular mechanisms of plant defence.


Subjects
Manihot; Pest Control; Plant Diseases
5.
Neurocrit Care ; 37(1): 302-313, 2022 08.
Article in English | MEDLINE | ID: mdl-35469391

ABSTRACT

BACKGROUND: Despite application of the multimodal European Resuscitation Council and European Society of Intensive Care Medicine algorithm, neurological prognosis of patients who remain comatose after cardiac arrest remains uncertain in a large group of patients. In this study, we investigate the additional predictive value of visual and quantitative brain magnetic resonance imaging (MRI) to electroencephalography (EEG) for outcome estimation of comatose patients after cardiac arrest. METHODS: We performed a prospective multicenter cohort study in patients admitted in a comatose state to the intensive care unit of two Dutch hospitals after cardiac arrest. Continuous EEG was recorded during the first 3 days and MRI was performed at 3 ± 1 days after cardiac arrest. EEG at 24 h and ischemic damage in 21 predefined brain regions on diffusion weighted imaging and fluid-attenuated inversion recovery, scored on a scale from 0 to 4, were related to outcome. Quantitative MRI analyses included mean apparent diffusion coefficient (ADC) and the percentage of brain volume with ADC < 450 × 10^-6 mm^2/s, < 550 × 10^-6 mm^2/s, and < 650 × 10^-6 mm^2/s. Poor outcome was defined as a Cerebral Performance Category score of 3-5 at 6 months. RESULTS: We included 50 patients, of whom 20 (40%) demonstrated poor outcome. Visual EEG assessment correctly identified 3 (15%) with poor outcome and 15 (50%) with good outcome. Visual grading of MRI identified 13 (65%) with poor outcome and 25 (89%) with good outcome. ADC analysis identified 11 (55%) with poor outcome and 3 (11%) with good outcome. EEG and MRI combined could predict poor outcome in 16 (80%) patients at 100% specificity, and good outcome in 24 (80%) at 63% specificity. Ischemic damage was most prominent in the cortical gray matter (75% vs. 7%) and deep gray nuclei (45% vs. 3%) in patients with poor versus good outcome. CONCLUSIONS: Magnetic resonance imaging is complementary to EEG for the prediction of poor and good outcome of patients after cardiac arrest who are comatose at admission.


Subjects
Coma; Heart Arrest; Cohort Studies; Coma/diagnostic imaging; Coma/etiology; Electroencephalography/methods; Heart Arrest/complications; Heart Arrest/diagnostic imaging; Heart Arrest/therapy; Humans; Prognosis; Prospective Studies
6.
PLoS Comput Biol ; 16(2): e1007570, 2020 02.
Article in English | MEDLINE | ID: mdl-32027649

ABSTRACT

Diseases of humans, animals, and plants remain an important challenge for our society. Effective control of invasive pathogens often requires concerted action by a large group of stakeholders. Both epidemiological and human behavioural factors influence the outcome of a disease control campaign. In the mathematical models frequently used to guide such campaigns, human behaviour is often ill-represented, if represented at all. Existing models of human, animal, and plant disease that do incorporate participation or compliance are often driven by pay-offs or by direct observations of the disease state. It is, however, well known that opinion is an important driver of human decision making. Here we consider the case study of citrus Huanglongbing disease (HLB), an acute bacterial disease that threatens the sustainability of citrus production across the world. We show how, by coupling an epidemiological model of this invasive disease with an opinion dynamics model, we are able to answer the question: what makes or breaks the effectiveness of a disease control campaign? Frequent contact between stakeholders and advisors is shown to increase the probability of successful control. More surprisingly, we show that informing stakeholders about the effectiveness of control methods is of much greater importance than prematurely increasing their perception of the risk of infection. We discuss the overarching consequences of this finding and its effect on human as well as plant disease epidemics.


Subjects
Citrus/microbiology; Plant Diseases/prevention & control; Rhizobiaceae/pathogenicity; Disease Outbreaks; Models, Theoretical; Plant Diseases/microbiology; Seasons
7.
Phytopathology ; 111(11): 1952-1962, 2021 Nov.
Article in English | MEDLINE | ID: mdl-33856231

ABSTRACT

Cassava (Manihot esculenta) is an important food crop across sub-Saharan Africa, where production is severely inhibited by two viral diseases, cassava mosaic disease (CMD) and cassava brown streak disease (CBSD), both propagated by a whitefly vector and via human-mediated movement of infected cassava stems. There is limited information on growers' behavior related to the movement of planting material, as well as on growers' perception and awareness of cassava diseases, despite the importance of these factors for disease control. This study surveyed a total of 96 cassava subsistence growers and their fields across five provinces in Zambia between 2015 and 2017 to address these knowledge gaps. CMD symptoms were observed in 81.6% of the fields, with an average incidence of 52% across the infected fields. No CBSD symptoms were observed. Most growers used planting materials from their own (94%) or nearby (<10 km) fields of family and friends, although several large transactions over longer distances (10 to 350 km) occurred with friends (15 transactions), markets (1), middlemen (5), and nongovernmental organizations (6). Information related to cassava diseases and certified clean (disease-free) seed reached only 48% of growers. The most frequent sources of information related to cassava diseases were nearby friends, family, and neighbors, while extension workers were the most highly preferred source of information. These data provide a benchmark from which to plan management approaches for controlling CMD and CBSD, which should include clean propagation material, increasing growers' awareness of the diseases, and increasing information provided to farmers (specifically disease symptom recognition and disease management options). Copyright © 2021 The Author(s). This is an open access article distributed under the CC BY 4.0 International license.


Subjects
Agriculture/methods; Hemiptera; Manihot; Plant Diseases; Animals; Plant Diseases/prevention & control; Plant Diseases/virology; Zambia
8.
BMC Geriatr ; 21(1): 58, 2021 01 14.
Article in English | MEDLINE | ID: mdl-33446116

ABSTRACT

BACKGROUND: In many cases, life-sustaining treatment preferences are not discussed with older patients in a timely manner. Advance care planning (ACP) offers medical professionals an opportunity to discuss patients' preferences. We assessed how often these preferences were known when older patients were referred to the emergency department (ED) for an acute geriatric assessment. METHODS: We conducted a descriptive study of patients referred to the ED for an acute geriatric assessment in a Dutch hospital. Patients were referred by general practitioners (GPs) or, in the case of nursing home residents, by elderly care physicians. The referring physician was asked whether preferences regarding life-sustaining treatments were known. The primary outcome was the number of patients for whom preferences were known. Secondary outcomes included which preferences were known, and which variables predict known preferences. RESULTS: Between 2015 and 2017, 348 patients were included in our study. At least one preference regarding life-sustaining treatments was known at referral in 45.4% (158/348) of cases. In these cases, cardiopulmonary resuscitation (CPR) policy was always included. Preferences regarding invasive ventilation policy and ICU admission were known in 17% (59/348) and 10.3% (36/348) of cases, respectively. Known preferences were more frequent in cases referred by an elderly care physician than by a GP (P < 0.001). CONCLUSIONS: In fewer than half the patients, at least one preference regarding life-sustaining treatments was known at the time of referral to the ED for an acute geriatric assessment; in most cases it concerned CPR policy. We recommend optimizing ACP conversations in a non-acute setting to provide more appropriate, desired, and personalized care to older patients referred to the ED.


Subjects
Advance Care Planning; Aged; Emergency Service, Hospital; Hospitals; Humans; Patient Preference; Referral and Consultation
9.
Ann Neurol ; 86(2): 203-214, 2019 08.
Article in English | MEDLINE | ID: mdl-31155751

ABSTRACT

OBJECTIVE: To provide evidence that early electroencephalography (EEG) allows for reliable prediction of poor or good outcome after cardiac arrest. METHODS: In a 5-center prospective cohort study, we included consecutive, comatose survivors of cardiac arrest. Continuous EEG recordings were started as soon as possible and continued up to 5 days. Five-minute EEG epochs were assessed by 2 reviewers, independently, at 8 predefined time points from 6 hours to 5 days after cardiac arrest, blinded to patients' actual condition, treatment, and outcome. EEG patterns were categorized as generalized suppression (<10 µV), synchronous patterns with ≥50% suppression, continuous, or other. Outcome at 6 months was categorized as good (Cerebral Performance Category [CPC] = 1-2) or poor (CPC = 3-5). RESULTS: We included 850 patients, of whom 46% had a good outcome. Generalized suppression and synchronous patterns with ≥50% suppression predicted poor outcome without false positives at ≥6 hours after cardiac arrest. Their summed sensitivity was 0.47 (95% confidence interval [CI] = 0.42-0.51) at 12 hours and 0.30 (95% CI = 0.26-0.33) at 24 hours after cardiac arrest, with specificity of 1.00 (95% CI = 0.99-1.00) at both time points. At 36 hours or later, sensitivity for poor outcome was ≤0.22. Continuous EEG patterns at 12 hours predicted good outcome, with sensitivity of 0.50 (95% CI = 0.46-0.55) and specificity of 0.91 (95% CI = 0.88-0.93); at 24 hours or later, specificity for the prediction of good outcome was <0.90. INTERPRETATION: EEG allows for reliable prediction of poor outcome after cardiac arrest, with maximum sensitivity in the first 24 hours. Continuous EEG patterns at 12 hours after cardiac arrest are associated with good recovery. ANN NEUROL 2019;86:203-214.


Subjects
Coma/diagnosis; Coma/physiopathology; Electroencephalography/methods; Heart Arrest/diagnosis; Heart Arrest/physiopathology; Aged; Cohort Studies; Coma/etiology; Female; Heart Arrest/complications; Humans; Male; Middle Aged; Predictive Value of Tests; Prospective Studies; Treatment Outcome
10.
Anesthesiology ; 132(4): 781-794, 2020 04.
Article in English | MEDLINE | ID: mdl-31977519

ABSTRACT

BACKGROUND: Mechanical complications arising after central venous catheter placement are mostly malposition or pneumothorax. To date, chest x-ray film has been the reference standard to confirm correct position and detect pneumothorax, while ultrasound might be an accurate alternative. The aim of this study was to evaluate the diagnostic accuracy of ultrasound to detect central venous catheter malposition and pneumothorax. METHODS: This was a prospective, multicenter, diagnostic accuracy study conducted at the intensive care unit and postanesthesia care unit. Adult patients who underwent central venous catheterization of the internal jugular vein or subclavian vein were included. The index test consisted of venous, cardiac, and lung ultrasound. The standard reference test was chest x-ray film. The primary outcome was the diagnostic accuracy of ultrasound to detect malposition and pneumothorax; for malposition, sensitivity, specificity, and other accuracy parameters were estimated. For pneumothorax, because chest x-ray film is an inaccurate reference standard for its diagnosis, agreement and Cohen's κ-coefficient were determined. Secondary outcomes were the accuracy of ultrasound to detect clinically relevant complications and the feasibility of ultrasound. RESULTS: In total, 758 central venous catheterizations were included. Malposition occurred in 23 (3.3%) out of 688 cases included in the analysis. Ultrasound sensitivity was 0.70 (95% CI, 0.49 to 0.86) and specificity 0.99 (95% CI, 0.98 to 1.00). Pneumothorax occurred in 5 (0.7%) and 11 (1.5%) out of 756 cases according to chest x-ray film and ultrasound, respectively. In 748 out of 756 cases (98.9%), there was agreement between ultrasound and chest x-ray film, with a Cohen's κ-coefficient of 0.50 (95% CI, 0.19 to 0.80). CONCLUSIONS: This multicenter study shows that the complication rate of central venous catheterization is low and that ultrasound provides moderate sensitivity and high specificity to detect malposition. There is moderate agreement with chest x-ray film for pneumothorax. In conclusion, ultrasound is an accurate diagnostic modality to detect malposition and pneumothorax.


Subjects
Catheterization, Central Venous/adverse effects; Catheterization, Central Venous/standards; Central Venous Catheters/adverse effects; Central Venous Catheters/standards; Ultrasonography, Interventional/methods; Ultrasonography, Interventional/standards; Aged; Female; Humans; Male; Middle Aged; Prospective Studies
11.
J Theor Biol ; 503: 110383, 2020 10 21.
Article in English | MEDLINE | ID: mdl-32569611

ABSTRACT

The use of insecticides to control agricultural pests has resulted in resistance developing to most known insecticidal modes of action. Strategies by which resistance development can be slowed are necessary to prolong the effectiveness of the remaining modes of action. Here we use a flexible mathematical model of resistance evolution to compare four insecticide application strategies: (i) applying one insecticide until failure, then switching to a second insecticide (sequential application); (ii) mixing two insecticides at their full label doses; (iii) rotating (alternating) two insecticides at full label dose; or (iv) mixing two insecticides at a reduced dose (with each mixture component at half the full label dose). The model represents target-site resistance. Multiple simulations were run representing different insect life histories and insecticide characteristics. The analysis shows that none of the strategies examined was optimal for all the simulations. The four strategies (reduced dose mixture, label dose mixture, sequential application, and label dose rotation) were optimal in 52%, 22%, 20%, and 6% of simulations, respectively. The most important trait determining the optimal strategy in a single simulation was whether or not the insect pest underwent sexual reproduction. For asexual insects, sequential application was most frequently the optimal strategy, while a label dose mixture was rarely optimal. Conversely, for sexual insects a mixture was nearly always the optimal strategy, with the reduced dose mixture being optimal twice as frequently as the label dose mixture. When sequential application of insecticides is not an option, a reduced dose mixture is most frequently the optimal strategy regardless of the insect's mode of reproduction.
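The kind of comparison the model makes can be sketched with a deliberately minimal two-locus toy: asexual reproduction, no mutation or migration, and purely illustrative fitness values, none of which come from the paper.

```python
# Toy two-locus asexual model (illustrative assumptions, not the paper's
# parameterisation): each insecticide kills 90% of genotypes susceptible at
# its target locus; the mixture exposes insects to both at once.
def step(freqs, fit):
    # one generation of selection: weight genotypes by fitness, renormalise
    w = [f * v for f, v in zip(freqs, fit)]
    tot = sum(w)
    return [x / tot for x in w]

def generations_until_failure(strategy, max_gen=10_000):
    # genotype order: SS, RS (R at locus 1), SR (R at locus 2), RR
    p = 1e-3  # initial resistance-allele frequency at each locus (assumption)
    freqs = [(1 - p) * (1 - p), p * (1 - p), (1 - p) * p, p * p]
    a_only  = [0.1, 1.0, 0.1, 1.0]    # insecticide A selects locus 1
    b_only  = [0.1, 0.1, 1.0, 1.0]    # insecticide B selects locus 2
    mixture = [0.01, 0.1, 0.1, 1.0]   # only double resistance escapes both
    for gen in range(1, max_gen):
        if strategy == "sequential":
            # spray A until locus-1 resistance dominates, then switch to B
            fit = a_only if freqs[1] + freqs[3] < 0.5 else b_only
        else:
            fit = mixture
        freqs = step(freqs, fit)
        if freqs[3] > 0.5:            # double resistance dominates: failure
            return gen
    return max_gen

print(generations_until_failure("sequential"))
print(generations_until_failure("mixture"))
```

In this toy run the sequential strategy outlasts the full-dose mixture for the asexual case, in line with the abstract's finding for asexual insects; none of the generation counts carry over to real pathosystems, and sexual reproduction (recombination) is exactly what the toy omits.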


Subjects
Insecticide Resistance; Insecticides; Agriculture; Animals; Insects; Insecticides/pharmacology
12.
JAMA ; 324(24): 2509-2520, 2020 12 22.
Article in English | MEDLINE | ID: mdl-33295981

ABSTRACT

Importance: It is uncertain whether invasive ventilation can use lower positive end-expiratory pressure (PEEP) in critically ill patients without acute respiratory distress syndrome (ARDS). Objective: To determine whether a lower PEEP strategy is noninferior to a higher PEEP strategy regarding duration of mechanical ventilation at 28 days. Design, Setting, and Participants: Noninferiority randomized clinical trial conducted from October 26, 2017, through December 17, 2019, in 8 intensive care units (ICUs) in the Netherlands among 980 patients without ARDS expected not to be extubated within 24 hours after start of ventilation. Final follow-up was conducted in March 2020. Interventions: Participants were randomized to receive invasive ventilation using either lower PEEP, consisting of the lowest PEEP level between 0 and 5 cm H2O (n = 476), or higher PEEP, consisting of a PEEP level of 8 cm H2O (n = 493). Main Outcomes and Measures: The primary outcome was the number of ventilator-free days at day 28, with a noninferiority margin for the difference in ventilator-free days at day 28 of -10%. Secondary outcomes included ICU and hospital lengths of stay; ICU, hospital, and 28- and 90-day mortality; development of ARDS, pneumonia, pneumothorax, severe atelectasis, severe hypoxemia, or need for rescue therapies for hypoxemia; and days with use of vasopressors or sedation. Results: Among 980 patients who were randomized, 969 (99%) completed the trial (median age, 66 [interquartile range {IQR}, 56-74] years; 246 [36%] women). At day 28, 476 patients in the lower PEEP group had a median of 18 ventilator-free days (IQR, 0-27 days) and 493 patients in the higher PEEP group had a median of 17 ventilator-free days (IQR, 0-27 days) (mean ratio, 1.04; 95% CI, 0.95-∞; P = .007 for noninferiority), and the lower boundary of the 95% CI was within the noninferiority margin. 
Occurrence of severe hypoxemia was 20.6% vs 17.6% (risk ratio, 1.17; 95% CI, 0.90-1.51; P = .99) and need for rescue strategy was 19.7% vs 14.6% (risk ratio, 1.35; 95% CI, 1.02-1.79; adjusted P = .54) in patients in the lower and higher PEEP groups, respectively. Mortality at 28 days was 38.4% vs 42.0% (hazard ratio, 0.89; 95% CI, 0.73-1.09; P = .99) in patients in the lower and higher PEEP groups, respectively. There were no statistically significant differences in other secondary outcomes. Conclusions and Relevance: Among patients in the ICU without ARDS who were expected not to be extubated within 24 hours, a lower PEEP strategy was noninferior to a higher PEEP strategy with regard to the number of ventilator-free days at day 28. These findings support the use of lower PEEP in patients without ARDS. Trial Registration: ClinicalTrials.gov Identifier: NCT03167580.


Subjects
Positive-Pressure Respiration/methods; Respiratory Insufficiency/therapy; APACHE; Aged; Critical Illness; Female; Humans; Intensive Care Units; Kaplan-Meier Estimate; Length of Stay; Male; Middle Aged; Oxygen/blood; Pneumonia, Ventilator-Associated; Pneumothorax/etiology; Positive-Pressure Respiration/adverse effects; Ventilator Weaning
13.
Crit Care Med ; 47(10): 1424-1432, 2019 10.
Article in English | MEDLINE | ID: mdl-31162190

ABSTRACT

OBJECTIVES: Visual assessment of the electroencephalogram by experienced clinical neurophysiologists allows reliable outcome prediction of approximately half of all comatose patients after cardiac arrest. Deep neural networks hold promise to achieve similar or even better performance, being more objective and consistent. DESIGN: Prospective cohort study. SETTING: Medical ICU of five teaching hospitals in the Netherlands. PATIENTS: Eight-hundred ninety-five consecutive comatose patients after cardiac arrest. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Continuous electroencephalogram was recorded during the first 3 days after cardiac arrest. Functional outcome at 6 months was classified as good (Cerebral Performance Category 1-2) or poor (Cerebral Performance Category 3-5). We trained a convolutional neural network, with a VGG architecture (introduced by the Oxford Visual Geometry Group), to predict neurologic outcome at 12 and 24 hours after cardiac arrest using electroencephalogram epochs and outcome labels as inputs. Output of the network was the probability of good outcome. Data from two hospitals were used for training and internal validation (n = 661). Eighty percent of these data was used for training and cross-validation, the remaining 20% for independent internal validation. Data from the other three hospitals were used for external validation (n = 234). Prediction of poor outcome was most accurate at 12 hours, with a sensitivity in the external validation set of 58% (95% CI, 51-65%) at false positive rate of 0% (CI, 0-7%). Good outcome could be predicted at 12 hours with a sensitivity of 48% (CI, 45-51%) at a false positive rate of 5% (CI, 0-15%) in the external validation set. CONCLUSIONS: Deep learning of electroencephalogram signals outperforms any previously reported outcome predictor of coma after cardiac arrest, including visual electroencephalogram assessment by trained electroencephalogram experts. 
Our approach offers the potential for objective, real-time, bedside insight into the neurologic prognosis of comatose patients after cardiac arrest.


Subjects
Coma/diagnosis; Deep Learning; Electroencephalography; Aged; Coma/etiology; Female; Heart Arrest/complications; Humans; Hypoxia, Brain/complications; Male; Middle Aged; Predictive Value of Tests; Prospective Studies
14.
J Theor Biol ; 461: 8-16, 2019 01 14.
Article in English | MEDLINE | ID: mdl-30342894

ABSTRACT

Monitoring for disease requires subsets of the host population to be sampled and tested for the pathogen. If all the samples return healthy, what are the chances the disease was present but missed? In this paper, we develop a statistical approach to this problem that accounts for a fundamental property of infectious diseases: their growing incidence in the host population. The model gives an estimate of the incidence probability density as a function of the sampling effort, and can be inverted to derive monitoring patterns that ensure a given maximum incidence in the population. We then present an approximation of this model, providing a simple rule of thumb for practitioners. The approximation is shown to be accurate for sample sizes larger than 20, and we demonstrate its use by applying it to three plant pathogens: citrus canker, bacterial blight, and grey mould.
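The abstract does not reproduce its formulas. As context, a sketch of the classical static version of this bound follows; the paper's contribution is to extend this style of calculation to incidence that grows during monitoring, which the code below deliberately ignores.

```python
import math

# Classical zero-positive bound (static incidence; the paper additionally
# accounts for epidemic growth, which this sketch does not).
# If n samples are all healthy, the Z upper one-sided confidence limit q
# solves (1 - q)^n = 1 - Z.
def max_plausible_incidence(n, z=0.95):
    return 1.0 - (1.0 - z) ** (1.0 / n)

# Large-n approximation ("rule of three" for z = 0.95): q ≈ -ln(1 - z) / n
def approx_incidence(n, z=0.95):
    return -math.log(1.0 - z) / n

n = 100
print(max_plausible_incidence(n))  # ≈ 0.0295
print(approx_incidence(n))         # ≈ 0.0300
```

The approximation is a slight over-estimate of the exact bound and converges quickly, which is the same flavour of simple, conservative rule of thumb the abstract advertises for practitioners.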


Subjects
Communicable Diseases/epidemiology; Epidemics/statistics & numerical data; Epidemiological Monitoring; Incidence; Models, Statistical; Animals; Humans; Plant Diseases/microbiology; Probability; Sample Size
15.
PLoS Comput Biol ; 13(7): e1005654, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28746374

ABSTRACT

Trade or sharing that moves infectious planting material between farms can, for vertically transmitted plant diseases, act as a significant force for the dispersal of pathogens, particularly where the extent of material movement may be greater than that of infected vectors or inoculum. The network over which trade occurs will then affect dispersal, and is important to consider when attempting to control the disease. We consider the difference that planting material exchange can make to the successful control of cassava brown streak disease, an important viral disease affecting one of Africa's staple crops. We use a mathematical model of smallholders' fields to determine the effect of informal trade on both the spread of the pathogen and its control using clean-seed systems, identifying aspects that could limit the damage caused by the disease. In particular, we identify the potentially detrimental effects of markets, and the benefits of a community-based approach to disease control.


Subjects
Crops, Agricultural; Host-Pathogen Interactions; Plant Diseases; Computational Biology; Farms; Plant Diseases/prevention & control; Plant Diseases/virology; Seeds/virology
16.
PLoS Comput Biol ; 13(8): e1005712, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28846676

ABSTRACT

The spread of pathogens into new environments poses a considerable threat to human, animal, and plant health, and by extension to human and animal wellbeing, ecosystem function, and agricultural productivity worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and economic considerations are of vital importance when planning surveillance efforts, it is also important to consider the epidemiological characteristics of the pathogen in question, including heterogeneities within the epidemiological system itself. One of the most pronounced realisations of this heterogeneity is seen in the case of vector-borne pathogens, which spread between 'hosts' and 'vectors', with each group possessing distinct epidemiological characteristics. As a result, an important question when planning surveillance for emerging vector-borne pathogens is where to place sampling resources in order to detect the pathogen as early as possible. We answer this question by developing a statistical function which describes the probability distributions of the prevalences of infection at first detection in both hosts and vectors. We also show how this method can be adapted to maximise the probability of early detection of an emerging pathogen within imposed sample size and/or cost constraints, and demonstrate its application using two simple models of vector-borne citrus pathogens. Under the assumption of a linear cost function, we find that sampling costs are generally minimised when either hosts or vectors, but not both, are sampled.
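The corner-solution result can be illustrated with a much simpler calculation than the paper's. The sketch below (not the paper's model) assumes fixed, known prevalences, independent samples, and a linear cost; all numbers are hypothetical.

```python
# Hedged sketch (not the paper's model): with host prevalence p_h, vector
# prevalence p_v, and n_h host / n_v vector samples drawn independently,
#   P(detect) = 1 - (1 - p_h)^n_h * (1 - p_v)^n_v.
def p_detect(p_h, p_v, n_h, n_v):
    return 1.0 - (1.0 - p_h) ** n_h * (1.0 - p_v) ** n_v

def best_sampling_split(p_h, p_v, c_h, c_v, budget):
    # scan the linear budget line c_h*n_h + c_v*n_v <= budget
    best = (0.0, 0, 0)
    for n_h in range(int(budget // c_h) + 1):
        n_v = int((budget - c_h * n_h) // c_v)
        p = p_detect(p_h, p_v, n_h, n_v)
        if p > best[0]:
            best = (p, n_h, n_v)
    return best

print(best_sampling_split(p_h=0.01, p_v=0.03, c_h=1.0, c_v=2.0, budget=50))
# best split is all-vector here: (≈0.533, 0, 25)
```

Because log(1 - P(miss)) is linear in each sample count, the whole budget goes to whichever group detects more per unit cost, so the optimum sits at a corner (all hosts or all vectors), echoing the abstract's conclusion for linear costs.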


Subjects
Disease Transmission, Infectious; Disease Vectors; Epidemiological Monitoring; Models, Biological; Models, Statistical; Animals; Computational Biology; Plant Diseases
17.
Phytopathology ; 108(7): 803-817, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29377769

ABSTRACT

Whether fungicide resistance management is optimized by spraying chemicals with different modes of action as a mixture (i.e., simultaneously) or in alternation (i.e., sequentially) has been studied by experimenters and modelers for decades, but results have been inconclusive. We use previously parameterized and validated mathematical models of wheat Septoria leaf blotch and grapevine powdery mildew to test which tactic provides better resistance management, measuring effectiveness as the total yield before resistance causes disease control to become economically ineffective ("lifetime yield"). We focus on tactics combining a low-risk and a high-risk fungicide, and on the case in which resistance to the high-risk chemical is complete (i.e., there is no partial resistance). Lifetime yield is then optimized by spraying as much low-risk fungicide as is permitted, combined with slightly more high-risk fungicide than needed for acceptable initial disease control, applying these fungicides as a mixture. The finding that mixture rather than alternation gives better performance is invariant to model parameterization and structure, as well as to the pathosystem in question. However, if the comparison focuses on other metrics, e.g., lifetime yield at full label dose, either mixture or alternation can be optimal. Our work shows how epidemiological principles can explain the evolution of fungicide resistance, and highlights a theoretical framework for addressing whether mixture or alternation provides better resistance management. It also demonstrates that precisely how spray tactics are compared must be given careful consideration. Copyright © 2018 The Author(s). This is an open access article distributed under the CC BY 4.0 International license.
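The mixture-versus-alternation comparison can be caricatured with a toy two-strain model (this is a sketch, not the paper's parameterized models): resistance to the high-risk fungicide is complete, selection acts each season in proportion to the strains' relative survival, and "lifetime yield" accumulates until control falls below an economic threshold. All dose-response coefficients and the threshold are invented.

```python
def season_fitness(dose_high, dose_low):
    """Toy per-season pathogen growth factors under fungicide doses.
    The sensitive strain is suppressed by both chemicals; the resistant
    strain escapes the high-risk chemical entirely (complete resistance)."""
    w_sensitive = 1.0 / (1.0 + 4.0 * dose_high + 2.0 * dose_low)
    w_resistant = 1.0 / (1.0 + 2.0 * dose_low)
    return w_sensitive, w_resistant

def lifetime_yield(schedule, r0=1e-6, threshold=0.6, max_seasons=200):
    """Accumulate yield until disease control drops below an economic
    threshold; `schedule` maps season index -> (dose_high, dose_low)."""
    r, total = r0, 0.0
    for season in range(max_seasons):
        ws, wr = season_fitness(*schedule(season))
        growth = (1 - r) * ws + r * wr          # mean pathogen growth
        season_yield = max(0.0, 1.0 - growth)   # toy yield mapping
        if season_yield < threshold:
            break                               # control has failed economically
        total += season_yield
        r = r * wr / growth                     # selection on resistant fraction
    return total

mixture = lifetime_yield(lambda s: (0.5, 0.5))  # half dose of each, every season
alternation = lifetime_yield(lambda s: (1.0, 0.0) if s % 2 == 0 else (0.0, 1.0))
```

Even in this crude setup, splitting the doses as a mixture outlasts alternating full doses, echoing the paper's headline result, while making clear that the outcome depends on exactly which metric is compared.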


Subjects
Ascomycota/drug effects; Fungicides, Industrial/administration & dosage; Fungicides, Industrial/pharmacology; Plant Diseases/microbiology; Dose-Response Relationship, Drug; Drug Resistance, Fungal; Models, Biological; Triticum/microbiology
18.
JAMA ; 319(10): 993-1001, 2018 03 13.
Article in English | MEDLINE | ID: mdl-29486489

ABSTRACT

Importance: It remains uncertain whether nebulization of mucolytics with bronchodilators should be applied for clinical indication or preventively in intensive care unit (ICU) patients receiving invasive ventilation. Objective: To determine if a strategy that uses nebulization for clinical indication (on-demand) is noninferior to one that uses preventive (routine) nebulization. Design, Setting, and Participants: Randomized clinical trial enrolling adult patients expected to need invasive ventilation for more than 24 hours at 7 ICUs in the Netherlands. Interventions: On-demand nebulization of acetylcysteine or salbutamol (based on strict clinical indications, n = 471) or routine nebulization of acetylcysteine with salbutamol (every 6 hours until end of invasive ventilation, n = 473). Main Outcomes and Measures: The primary outcome was the number of ventilator-free days at day 28, with a noninferiority margin for a difference between groups of -0.5 days. Secondary outcomes included length of stay, mortality rates, occurrence of pulmonary complications, and adverse events. Results: Nine hundred twenty-two patients (34% women; median age, 66 (interquartile range [IQR], 54-75 years) were enrolled and completed follow-up. At 28 days, patients in the on-demand group had a median 21 (IQR, 0-26) ventilator-free days, and patients in the routine group had a median 20 (IQR, 0-26) ventilator-free days (1-sided 95% CI, -0.00003 to ∞). There was no significant difference in length of stay or mortality, or in the proportion of patients developing pulmonary complications, between the 2 groups. Adverse events (13.8% vs 29.3%; difference, -15.5% [95% CI, -20.7% to -10.3%]; P < .001) were more frequent with routine nebulization and mainly related to tachyarrhythmia (12.5% vs 25.9%; difference, -13.4% [95% CI, -18.4% to -8.4%]; P < .001) and agitation (0.2% vs 4.3%; difference, -4.1% [95% CI, -5.9% to -2.2%]; P < .001). 
Conclusions and Relevance: Among ICU patients receiving invasive ventilation who were expected to not be extubated within 24 hours, on-demand compared with routine nebulization of acetylcysteine with salbutamol did not result in an inferior number of ventilator-free days. On-demand nebulization may be a reasonable alternative to routine nebulization. Trial Registration: clinicaltrials.gov Identifier: NCT02159196.
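The noninferiority decision rule reported above can be sketched generically: the on-demand strategy is declared noninferior when the lower limit of a one-sided 95% CI for the between-group difference in ventilator-free days stays above the -0.5-day margin. The effect estimate and standard error below are hypothetical placeholders, not the trial's data.

```python
Z_95_ONE_SIDED = 1.6448536269514722  # Phi^-1(0.95), one-sided 95% quantile

def noninferiority_lower_bound(diff, se):
    """Lower limit of a one-sided 95% normal-approximation CI for the
    between-group difference (on-demand minus routine)."""
    return diff - Z_95_ONE_SIDED * se

MARGIN = -0.5  # noninferiority margin in ventilator-free days

# Hypothetical difference and standard error (NOT taken from the trial).
lower = noninferiority_lower_bound(diff=0.5, se=0.3)
noninferior = lower > MARGIN
```

With these placeholder numbers the lower bound clears the margin, so noninferiority would be declared; with a lower bound below -0.5 it would not.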


Subjects
Acetylcysteine/administration & dosage; Albuterol/administration & dosage; Critical Care; Nebulizers and Vaporizers; Respiration, Artificial; Administration, Inhalation; Adult; Aged; Female; Humans; Intensive Care Units; Length of Stay; Male; Middle Aged; Treatment Outcome; Ventilator Weaning
19.
Proc Biol Sci ; 284(1859)2017 Jul 26.
Article in English | MEDLINE | ID: mdl-28724732

ABSTRACT

The number of emerging tree diseases has increased rapidly in recent times, with severe environmental and economic consequences. Systematic regulatory surveys to detect and establish the distribution of pests are crucial for successful management efforts, but are resource-intensive and costly. Volunteers who identify potential invasive species can form an important early-warning network in tree health; however, what these data can tell us, and how they can best be used to inform and direct official survey effort, is not clear. Here, we use an extensive dataset on acute oak decline (AOD) as an opportunity to ask how verified data received from the public can be used. Information on the distribution of AOD was available from (i) systematic regulatory surveys conducted throughout England and Wales, and (ii) ad hoc sightings reported by landowners, land managers and members of the public (i.e. 'self-reported' cases). By using the available self-reported cases at the design stage, the systematic survey could focus on defining the boundaries of the affected area. This maximized the use of available resources and highlights the benefits to be gained by developing strategies to enhance volunteer efforts in future programmes.


Subjects
Data Collection/methods; Plant Diseases; Quercus; Community-Based Participatory Research; Conservation of Natural Resources; England; Forestry; Forests; Surveys and Questionnaires; Wales
20.
Proc Biol Sci ; 284(1863)2017 Sep 27.
Article in English | MEDLINE | ID: mdl-28931732

ABSTRACT

Cultivar resistance is an essential part of disease control programmes in many agricultural systems. The use of resistant cultivars applies a selection pressure on pathogen populations for the evolution of virulence, resulting in loss of disease control. Various techniques for the deployment of host resistance genes have been proposed to reduce the selection for virulence, but these are often difficult to apply in practice. We present a general technique to maintain the effectiveness of cultivar resistance, derived from classical population genetics theory: any factor that reduces the population growth rates of both the virulent and avirulent strains will reduce selection. We model the specific example of fungicide application to reduce the growth rates of virulent and avirulent strains of a pathogen, demonstrating that appropriate use of fungicides reduces selection for virulence, prolonging cultivar resistance. This specific example of chemical control illustrates a general principle for the development of techniques to manage the evolution of virulence by slowing epidemic growth rates.
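The population-genetics principle above can be made concrete with a two-strain exponential-growth sketch (illustrative rates, not fitted to any pathosystem): a control measure that multiplies both strains' growth rates by a common factor shrinks the growth-rate gap and therefore slows the rise in frequency of the virulent strain.

```python
import math

def virulent_frequency(p0, r_avirulent, r_virulent, t, suppression=1.0):
    """Frequency of the virulent strain after time t when both strains
    grow exponentially and a control measure (e.g. a fungicide that
    hits both strains) scales BOTH growth rates by `suppression`."""
    avr = (1 - p0) * math.exp(suppression * r_avirulent * t)
    vir = p0 * math.exp(suppression * r_virulent * t)
    return vir / (avr + vir)

# Illustrative numbers: virulent strain starts rare but grows faster.
unsprayed = virulent_frequency(p0=0.01, r_avirulent=0.05,
                               r_virulent=0.10, t=100)
sprayed = virulent_frequency(p0=0.01, r_avirulent=0.05,
                             r_virulent=0.10, t=100, suppression=0.4)
```

Because selection depends on the difference in growth rates, suppressing both strains leaves the virulent strain far rarer at the same time point, which is the mechanism by which slowing epidemic growth prolongs cultivar resistance.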


Subjects
Agriculture; Crops, Agricultural/genetics; Disease Resistance/genetics; Fungi/pathogenicity; Plant Diseases/genetics; Fungi/drug effects; Fungi/genetics; Fungicides, Industrial; Genetics, Population; Plant Diseases/microbiology; Selection, Genetic; Virulence